Message boards : Web interfaces : cURL a solution for using fetch_files.sh
Author
Joined: 11 Oct 11
Posts: 58

Message
Hello, maybe this was not a problem for anyone else, but when I tried to use a variation of the fetch_files.sh script, two web hosts told me they don't allow wget, probably because you can grab entire sites with wget, which in a shared hosting environment could be a resource-abuse issue.

My script does some things differently, but the basics are:

(excerpt from the original script)

    wget -N --tries=2 -nv http://abcathome.com/stats/team.gz

(example using curl to grab the file)

    curl http://abcathome.com/stats/team.gz > team.gz

I'm not a "curl expert", so maybe there are options similar to --tries=2 that I haven't explored yet, but for a basic download it has been working just fine.

So if you are trying to implement these functions and find your host won't allow wget, try curl. If anyone knows of downsides to using curl for this, please let me know. Thanks.
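On the question about an equivalent of --tries=2: curl has a --retry option, and its -z (--time-cond) and -R (--remote-time) flags roughly approximate wget's -N timestamp check. Below is a minimal sketch, assuming the same abcathome.com stats URL from the post; the filenames and retry count are just illustrative, and the behavior of -z with an existing output file can differ between curl versions, so treat it as a starting point rather than a drop-in replacement for fetch_files.sh.

    #!/bin/sh
    # Sketch of a curl-based fetch roughly matching the wget line above.
    # --retry 2 : retry transient failures, similar in spirit to wget's --tries=2
    # -f        : treat HTTP errors (e.g. 404) as failures
    # -sS       : quiet output, but still print errors
    # -R        : keep the server's modification time on the saved file
    # -z FILE   : only transfer if the remote copy is newer than FILE (approximates wget -N)
    # -o FILE   : write to a named file instead of stdout

    URL="http://abcathome.com/stats/team.gz"
    OUT="team.gz"

    if [ -f "$OUT" ]; then
        # Conditional fetch: skip the download if the server copy is not newer.
        curl -fsS --retry 2 -R -z "$OUT" -o "$OUT" "$URL"
    else
        # First fetch: no local copy yet, so download unconditionally.
        curl -fsS --retry 2 -R -o "$OUT" "$URL"
    fi

If the host's curl build happens to re-create an empty team.gz when the remote file has not changed, a safer variant is to download to a temporary file and only move it into place when the transfer actually produced data.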