Memory Error in pandas.concat with Python (DataSciences forum)
I have recently been working with 80 dataframes in Python, each holding 650K rows with the same 11 columns, ranging from strings to numbers. My goal is to concatenate all of the dataframes and run a general analysis, but I keep hitting dead ends:
1. I started with pd.concat([df1, df2]) but ended up with a MemoryError, and I could not find a solution after searching the internet.
2. Following some online suggestions, I then converted the numeric columns to float32 to reduce the memory burden, but that did not help either.
3. For background on my PC and Python setup: I am running 32-bit Python 2.7.5 with pandas 0.13.0, and I have little chance of switching to a 64-bit build.
Any comments or suggestions would be really appreciated.
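For reference, the two attempts described above can be sketched roughly as follows. The frames here are tiny hypothetical stand-ins for the real 80 dataframes of 650K rows each, and the column names are made up for illustration:

```python
import numpy as np
import pandas as pd

# Hypothetical stand-in for one of the 80 real dataframes: a string
# column plus a numeric column, mirroring the mixed 11-column layout.
def make_frame(n_rows=5):
    return pd.DataFrame({
        "label": ["row%d" % i for i in range(n_rows)],
        "value": np.arange(n_rows, dtype=np.float64),
    })

frames = [make_frame() for _ in range(3)]

# Attempt 1: the plain concat that raised MemoryError at full scale.
combined = pd.concat(frames, ignore_index=True)

# Attempt 2: downcast float64 columns to float32 before concatenating.
# This halves the memory used by those columns, but the peak usage
# during concat can still exceed a 32-bit process's address space.
for df in frames:
    for col in df.columns:
        if df[col].dtype == np.float64:
            df[col] = df[col].astype(np.float32)

combined32 = pd.concat(frames, ignore_index=True)
print(combined32["value"].dtype)  # float32
```

At the full 80 x 650K scale, both versions must briefly hold the inputs and the concatenated copy in memory at once, which is likely what exhausts the roughly 2 GB available to a 32-bit process.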