Vista SP1: as soon as I click on a video file, I get "Windows Explorer has stopped working"
w*i #2
Could someone tell me how to free the memory used by huge strings in Python?
I have two huge strings, R0 and R1. Why doesn't "R0 = R1" free the memory of the previous R0?
For example, this frees the memory of R0:
>>> R0 = open(file1).readlines()
>>> del R0
But the following doesn't free the memory:
>>> R0 = open(file1).readlines()
>>> R1 = open(file2).readlines()
>>> R0 = R1 # this doesn't free memory of file1.
>>> del R0
>>> del R1 # this doesn't free the memory either
What is the best way to free this memory?
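
Here is roughly how I am watching the memory, in case that matters. Just a sketch: it assumes Linux (it reads VmRSS from /proc/self/status), and file1/file2 are placeholder paths to two big text files.

import os

def rss_kb():
    # current resident set size of this process, in kB (Linux only)
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])

file1 = "big_file_1.txt"   # placeholder
file2 = "big_file_2.txt"   # placeholder

print("start: %d kB" % rss_kb())
R0 = open(file1).readlines()
R1 = open(file2).readlines()
print("after reading both files: %d kB" % rss_kb())

R0 = R1        # the list read from file1 loses its last reference here
del R0, R1     # now both names are gone
print("after del: %d kB" % rss_kb())   # this stays high for me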
m*1 #3
And then it restarts. How do I fix this? I've seen a lot of people with the same problem.
It's an HP dv2700.
I turned off User Account Control and that still didn't fix it.
Does anyone have a solution?
r*t #4
R0 and R1 should be big lists, not strings, shouldn't they?
del obj only decrements the reference count by 1. When the reference count reaches 0, the interpreter will get around to calling obj.__del__() and destroying the object. But even after the object is destroyed, the memory it occupied is not necessarily returned to the OS.
Big string objects (anything over the small-object threshold, 256 bytes) are allocated with malloc, and the Python interpreter hoards that memory in case it needs the space again, instead of giving it back to the OS. This is not a leak, because the next big string can reuse that memory. The interpreter also keeps up to 80 deleted list objects around for reuse.
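
A small sketch of the reference-count part, if it helps (note that sys.getrefcount always reports one extra reference, held by its own argument):

import sys

big = ["x" * 1000 for _ in range(1000)]   # a list with 1000 string elements
alias = big                                # a second reference to the same list

print(sys.getrefcount(big))   # 3: big, alias, plus getrefcount's own argument

alias = None                  # drop one reference
print(sys.getrefcount(big))   # 2: big, plus the argument

big = None                    # the count reaches 0 and CPython destroys the list,
                              # but the freed memory may stay with the interpreter's
                              # allocator instead of going straight back to the OS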
w*i #5
Right, they should be lists. So even if these lists take a huge amount of memory, they will be kept around for potential future reuse? That seems suboptimal, since the probability that a huge list (consisting of huge strings) will be reused should be very low. In my case they really do eat up a huge amount of memory. Is there any way to explicitly free the memory held by these lists?
Thanks
r*t #6
It is the strings referenced by the list that are taking up the memory, not the list itself.
Let me copy you a passage from http://effbot.org/pyfaq/why-doesnt-python-release-the-memory-when-i-delete-a-large-object.htm :
"Exactly if and when Python's allocator returns memory to the C runtime, and when the C runtime returns memory to the operating system, depends on a lot of parameters, including Python and library versions, your application's object allocation patterns, and so on. For example, CPython 2 ..."
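
To make the first point concrete, a quick sketch (exact sizes vary by platform and Python version):

import sys

lines = ["x" * 10000 for _ in range(1000)]   # roughly 10 MB of string data

list_object = sys.getsizeof(lines)                  # just the list: header + pointers
string_data = sum(sys.getsizeof(s) for s in lines)  # the strings the list points to

print("the list object itself: %d bytes" % list_object)   # a few kB
print("the strings it holds:   %d bytes" % string_data)   # on the order of 10 MB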
