[Repost] help: Linux's 2GB file size limit (Unix board)
m*t
1
[The following text is reposted from the Linux discussion board]
[Original post by maynot]
I got a 1.9GB compressed file (a.gz) from a co-worker. Once decompressed, it will definitely exceed the 2GB limit. Is there any way to read it directly with Fortran/C, or to decompress and split it into several smaller files that can be read? Thanks a lot.
z*w
2
Linux supports big files unless your box is very old; I have many 4+ GB
files. You can probably just try extracting it first.
Another way is to use "zlib" in C to read the file directly; a rough sketch follows:
http://www.gzip.org/zlib/
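Something along these lines should work: gzopen/gzread stream the uncompressed data through a buffer, so the decompressed file never has to exist on disk at all. "a.gz" is the file name from the original post and the buffer size is an arbitrary choice; link with -lz.

/* Stream a .gz file through zlib without writing the decompressed
   data to disk first. Compile: cc read_gz.c -lz */
#include <stdio.h>
#include <zlib.h>

int main(void)
{
    gzFile in = gzopen("a.gz", "rb");
    if (in == NULL) {
        perror("gzopen");
        return 1;
    }

    char buf[65536];
    int n;
    /* gzread returns the number of uncompressed bytes read,
       0 at end of file, -1 on error */
    while ((n = gzread(in, buf, sizeof buf)) > 0) {
        /* process the n bytes in buf here; this sketch just echoes them */
        fwrite(buf, 1, (size_t)n, stdout);
    }
    if (n < 0) {
        int err;
        fprintf(stderr, "gzread: %s\n", gzerror(in, &err));
    }
    gzclose(in);
    return n < 0 ? 1 : 0;
}

Since the decompressed data never touches the filesystem, the 2GB limit on the output file never comes into play.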

[Quoting m****t's post:]
: [The following text is reposted from the Linux discussion board]
: [Original post by maynot]
: I got a 1.9GB compressed file (a.gz) from a co-worker. Once decompressed, it will definitely exceed the 2GB limit. Is there any way to read it directly with Fortran/C, or to decompress and split it into several smaller files that can be read? Thanks a lot.

e*i
3

Agreed. Even many years ago with the 2.0.x kernel, I could easily tar big files
(>2GB) and then split them into 100MB chunks on my 486. A sketch of the chunking idea follows.
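With GNU tools the one-liner is "tar cf - somedir | split -b 100m - chunk_". The same chunking idea in C might look roughly like this; the chunk size, the "chunk.NNN" naming, and reading from stdin are all illustrative choices, and _FILE_OFFSET_BITS is defined so 32-bit builds get 64-bit file offsets:

/* Read a (possibly >2GB) stream on stdin and write it out as
   numbered 100MB chunk files: chunk.000, chunk.001, ... */
#define _FILE_OFFSET_BITS 64
#include <stdio.h>

#define CHUNK_BYTES (100L * 1024 * 1024)  /* 100 MB per chunk */

int main(void)
{
    char buf[65536];
    size_t n;
    long written = 0;
    int idx = 0;
    FILE *out = NULL;
    char name[32];

    while ((n = fread(buf, 1, sizeof buf, stdin)) > 0) {
        /* start a new chunk file when none is open or the current
           one has reached the chunk size */
        if (out == NULL || written >= CHUNK_BYTES) {
            if (out != NULL)
                fclose(out);
            snprintf(name, sizeof name, "chunk.%03d", idx++);
            out = fopen(name, "wb");
            if (out == NULL) {
                perror("fopen");
                return 1;
            }
            written = 0;
        }
        fwrite(buf, 1, n, out);
        written += (long)n;
    }
    if (out != NULL)
        fclose(out);
    return 0;
}

Run it as "./chunker < big.tar"; the pieces can be reassembled later with "cat chunk.* > big.tar".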

[Quoting z*******w's post:]
: Linux supports big files unless your box is very old; I have many 4+ GB
: files. You can probably just try extracting it first.
: Another way is to use "zlib" in C to read the file directly:
: http://www.gzip.org/zlib/

m*t
4
You guys are right. I was working on a RAID disk, which apparently behaves differently.
Now that I've moved the file into my home directory, there's no problem. This time the
problem is that Fortran cannot handle such a big file, and I have no idea how to work around it.
Any suggestions? Thanks a lot.
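Large-file support in Fortran is compiler-specific (it often takes a compiler flag or a newer compiler release to get 64-bit file offsets), so one common workaround is to do the heavy I/O from C and hand the data to the Fortran code. On the C side the usual fix is the standard LFS macros; a rough sketch, where "bigdata.dat" is a hypothetical file name:

/* Open and seek past 2GB from C. Defining _FILE_OFFSET_BITS=64 before
   any #include makes off_t 64-bit on 32-bit systems, so
   fopen/fseeko/ftello work beyond the 2GB mark. */
#define _LARGEFILE_SOURCE
#define _FILE_OFFSET_BITS 64
#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    FILE *f = fopen("bigdata.dat", "rb");
    if (f == NULL) {
        perror("fopen");
        return 1;
    }
    /* seek 3GB into the file: impossible with a 32-bit off_t */
    if (fseeko(f, (off_t)3 * 1024 * 1024 * 1024, SEEK_SET) != 0) {
        perror("fseeko");
        fclose(f);
        return 1;
    }
    printf("now at offset %lld\n", (long long)ftello(f));
    fclose(f);
    return 0;
}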

[Quoting e*i's post:]
:
: Agreed. Even many years ago with the 2.0.x kernel, I could easily tar big files
: (>2GB) and then split them into 100MB chunks on my 486.
