scipy.optimize.curve_fit: not a proper array of floats error


I'm trying to use optimization.curve_fit to find the least square solution between two arrays, but I keep getting error: Result from function call is not a proper array of floats. I pasted my code below. Any ideas how to fix this? Thank you!

import numpy as np
import scipy.optimize as optimization

pcone = np.array([[-0.01043151],
                  ...])  # remaining rows truncated; shape (10, 1)

pctwo = np.array([[0.02550008],
                  ...])  # remaining rows truncated; shape (10, 1)

def func(x, a, b, c):
    return a + b*x + c*x*x

print(optimization.curve_fit(func, pcone, pctwo))

Your arrays have shape (10, 1). That is, they are two-dimensional, with a trivial second dimension. In the simplest case, curve_fit expects one-dimensional arrays. Flatten pcone and pctwo into one-dimensional arrays before passing them to curve_fit.

For example, this works:

In [8]: curve_fit(func, pcone.ravel(), pctwo.ravel())
(array([ 0.05720879,  0.65281483, -2.67840575]),
 array([[  5.90887090e-04,   4.15822858e-02,   6.14439732e-01],
        [  4.15822858e-02,   4.07354227e+00,   6.94784914e+01],
        [  6.14439732e-01,   6.94784914e+01,   1.29240335e+03]]))

(You haven't shown how pcone and pctwo were created. It would probably be cleaner to create them as 1-D arrays in the first place, instead of flattening them later.)
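Since the original data isn't shown in full, here is a self-contained sketch using synthetic stand-in data with the same (10, 1) shape, showing the `ravel()` fix in context (the values and the seed are made up for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

def func(x, a, b, c):
    return a + b*x + c*x*x

# Synthetic stand-in data with the same (10, 1) shape as the question's arrays.
rng = np.random.default_rng(0)
x2d = np.linspace(0, 1, 10).reshape(-1, 1)                        # shape (10, 1)
y2d = func(x2d, 0.5, 1.0, -2.0) + 0.01 * rng.standard_normal((10, 1))

# Passing the 2-D arrays directly triggers the "not a proper array of
# floats" error; flattening them to 1-D first works:
popt, pcov = curve_fit(func, x2d.ravel(), y2d.ravel())
print(popt)  # fitted a, b, c
```

If you control how the data is built, creating it 1-D from the start (e.g. `np.linspace(0, 1, 10)` without the `reshape`) avoids the problem entirely.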