cURL, the omnipresent data tool, is getting a 25th birthday party this month

'When you first start messing with the command line, it can feel like there's an impermeable wall between the local space you're messing around in and the greater Internet. On your side, you've got your commands and files, and beyond the wall, there are servers, images, APIs, webpages, and more bits of useful, ever-changing data. One of the most popular ways through that wall has been cURL, or "client URL," which turns 25 this month.

The cURL tool started as a way for programmer Daniel Stenberg to let Internet Relay Chat users quickly fetch currency exchange rates while still inside their chat window. As detailed in an archived history of the project, it was originally based on an existing command-line tool, httpget, written by Rafael Sagula. A 1.0 version was released in 1997; by version 2.0 it had been renamed urlget, having added support for GOPHER, FTP, and other protocols. By 1998, the tool could upload as well as download, and so version 4.0 was named cURL.

Over the next few years, cURL grew to encompass nearly every Internet protocol, work with certificates and encryption, offer bindings for more than 50 languages, and be included in most Linux distributions and other systems. The cURL project now encompasses both the command-line tool itself and the libcurl library. In 2020, the project's history estimated the command and library had been installed in more than 10 billion instances worldwide.'

-- source: https://arstechnica.com/information-technology/2023/03/curl-the-omnipresent-...

Cheers, Peter

--
Peter Reutemann
Dept. of Computer Science
University of Waikato, Hamilton, NZ
Mobile +64 22 190 2375
https://www.cs.waikato.ac.nz/~fracpete/
http://www.data-mining.co.nz/

On Mon, 13 Mar 2023 10:04:18 +1300, Peter Reutemann quoted:
'The cURL tool started as a way for programmer Daniel Stenberg to let Internet Relay Chat users quickly fetch currency exchange rates while still inside their chat window.'
Curl vs wget -- which one to use, and when? Curl seems better at HTTP form submissions. Wget can do bulk HTTP/FTP downloads, including mirroring of entire sites. Curl supports a much wider range of protocols, including SCP and SFTP, while Wget concentrates on HTTP/HTTPS and FTP. One minor thing: wget will use the “Last-Modified” HTTP header line, if specified, to set the last-modified date/time on the downloaded file. I can’t see any curl option to do this.
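For what it's worth, curl does have an option for the Last-Modified case: `-R`/`--remote-time` sets the downloaded file's timestamp from the server's "Last-Modified" header. Below is a sketch of the division of labour described above, with placeholder URLs (example.com and the form fields are assumptions, not real endpoints), plus one offline-runnable line showing that curl speaks non-HTTP protocols such as file://:

```shell
# curl: submit an HTTP form via POST (the use case the thread favours curl for):
#   curl -d 'name=Peter' -d 'city=Hamilton' https://example.com/submit

# curl: download a file under its remote name, preserving the server's
# Last-Modified time (-R/--remote-time):
#   curl -R -O https://example.com/file.tar.gz

# wget: mirror an entire site for offline browsing (the use case wget excels at):
#   wget --mirror --convert-links --page-requisites https://example.com/

# wget: timestamping -- re-download only if the remote copy is newer:
#   wget -N https://example.com/file.tar.gz

# A runnable, offline demonstration: curl supports many protocols beyond
# HTTP/HTTPS, including file://, so a local file can be fetched the same way.
tmp=$(mktemp -d)
printf 'NZD exchange rates\n' > "$tmp/rates.txt"
fetched=$(curl -s "file://$tmp/rates.txt")
echo "$fetched"
```

The mirroring flags are the usual wget trio: `--mirror` recurses with timestamping, `--convert-links` rewrites links for local viewing, and `--page-requisites` also grabs the images and stylesheets each page needs.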
participants (2)
- Lawrence D'Oliveiro
- Peter Reutemann