PySpark: calculate the average of numbers from a file

I have a CSV file whose lines each contain a key and a number:

0,0.252353e+01
0,0.24743e-01
0,0.265852e+00
0,0.85346e-01
0,0.5223e+01
 ...

I want to compute the average of those numbers using PySpark and Python.

I have tried (where `lines` is the RDD of lines read from the file):

    numbers = lines.flatMap(lambda number: number)
    sumCount = numbers.combineByKey(lambda value: (value, 1),
                                    lambda x, value: (x[0] + value, x[1] + 1),
                                    lambda x, y: (x[0] + y[0], x[1] + y[1]))
    averageByKey = sumCount.map(lambda (label, (value_sum, count)): (label, value_sum / count))

but it does not work: the result is not the average, and on Python 3 the last line is a syntax error.