Using Download Helper With Extremely Large Files
[eluser]nirbhab[/eluser]
Great work, I am jealous of you :-) I hope I would have helped you with the same. Well done, good going... it seems that it will work.
[eluser]dawnerd[/eluser]
[quote author="nirbhab" date="1200755474"]Great work, I am jealous of you :-) I hope I would have helped you with the same. Well done, good going...[/quote] You get all the credit, man. You pointed me in the right direction. Hell, if it wasn't for you I'd be pulling my hair out right about now.
[eluser]bardelot[/eluser]
Maybe you want to check "connection_status()" (http://phpfer.com/rn10re180.html). That's what I'm using. Code: ...
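A minimal sketch of what that might look like: check connection_status() inside the read loop so the script stops streaming once the client disconnects. The file name and chunk size here are just placeholders, not from the original post.

```php
<?php
// Hedged sketch: stream a file in chunks, but stop as soon as
// the client aborts or the connection times out.
// 'large_file.zip' and the 8 KB chunk size are hypothetical.
ignore_user_abort(true);

$handle = fopen('large_file.zip', 'rb');
if ($handle !== false) {
    while (!feof($handle)) {
        echo fread($handle, 8192);
        flush();
        // CONNECTION_NORMAL means the client is still listening;
        // anything else (ABORTED/TIMEOUT) means we should bail out.
        if (connection_status() != CONNECTION_NORMAL) {
            break;
        }
    }
    fclose($handle);
}
```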
[eluser]BobbyB[/eluser]
Hi guys, I am also having the problem of big files not being downloaded because of memory issues. "file_get_contents" tries to load the whole file into RAM, and this makes it fail. I have been trying to solve this and also stumbled upon the script from php.net, but I don't have the skills to implement it. Could you please share your finished script so I/we can use it? Thanks in advance. Cheers. EDIT: I found just what I was looking for in this thread: Advanced File Downloading Library. Thanks Mr.XtraFile! Keep it up!
[eluser]Unknown[/eluser]
[quote author="dawnerd" date="1200755009"]I did some reading on the readfile function and I found that through some sly programming, you can mimic the readfile function but have it read the file in chunks, thus avoiding the max memory errors. Here's the code that was posted on php.net Code: <?php I'm sure I'm not the only one that could use this. Now, to get it to work in CodeIgniter...[/quote] I congratulate you on your awesomeness. Thanks for posting this, it really helped me out.
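For readers landing here now: the php.net snippet referenced above was a chunked replacement for readfile(). A sketch of that approach, assuming a 1 MB chunk size (the function name readfile_chunked matches the well-known php.net user note, but the exact code here is a reconstruction, not the original post's):

```php
<?php
// Reads and echoes a file in fixed-size chunks so the whole file
// is never held in memory at once, unlike file_get_contents().
// Returns the number of bytes delivered (like readfile()) when
// $retbytes is true, or false on failure.
function readfile_chunked($filename, $retbytes = true)
{
    $chunksize = 1024 * 1024; // 1 MB per chunk (an assumption; tune as needed)
    $cnt = 0;

    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }

    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer;
        // Push each chunk out to the client before reading the next one.
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }

    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt;
    }
    return $status;
}
```

In a controller you would send the usual Content-Type/Content-Disposition headers first and then call readfile_chunked($path) instead of readfile($path); peak memory stays near the chunk size regardless of file size.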