wget for Mac OS X

Update: I’ve posted a new, updated version of wget for OS X which you may want to try instead.

If you want to grab files from the web using the command line, the wget utility is great.

Recent versions of Mac OS X don’t include it. They come with curl instead, which has some good features, but is also missing a great deal.
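As a rough comparison (the URL below is only a placeholder), here is how the two tools differ for a simple download — wget saves to a local file and resumes easily, while curl prints to stdout unless told otherwise:

```shell
# wget saves to a file named after the URL; -c resumes a partial download
wget http://example.com/file.tar.gz
wget -c http://example.com/file.tar.gz

# curl writes to stdout by default; -O saves under the remote name,
# and '-C -' resumes from where a previous attempt stopped
curl -O http://example.com/file.tar.gz
curl -C - -O http://example.com/file.tar.gz
```

curl can do most single-file jobs, but wget's recursive retrieval (shown in the comments below) has no direct curl equivalent.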

Here’s wget.zip, which contains wget built for Mac OS X 10.3.
Hope someone finds it useful!

Update: If you like this, you might also like my mtr for Mac OS X, or be interested in lots of other Apple-related stuff here.



Thanks for the binary. Wget is way better than safari for resuming downloads.

Thanks Q — saved me some SERIOUS time waiting for Perl’s CPAN to try to find things with FTP since I didn’t have wget.

Many thanks. Am still puzzled why make and make install didn’t work, using the GNU download. Any suggestions for future reference?

Steve – I think I just used the instructions
here and built from the latest CVS.
There may have been some recent fixes which make building on the Mac easier. But I didn’t do anything clever!

much appreciated, you the fella. wget for OSX is great, wget for OSX is good, it’s a delight. wget hooray! (ok i’m trying to get you more googlegoodness i admit it)

Thank you very much for providing this. I was a bit perplexed to find that it wasn’t included by default.

Thanks for this! Wget should come as default!

Thanks, very useful indeed. Much less of a hassle than figuring out how to use curl; its --help screen scared me and made me run screaming to Google for ‘wget mac’ 🙂

Thanks for compiling this. I think I’ve got it running, but your instructions left me a bit perplexed. I’m running OS X 10.3.9, and I don’t seem to have the three target directories you mentioned in the Readme file: /usr/local/bin, /usr/local/man/man1, and /usr/local/etc. (Actually, I now have a /usr/local/bin, but I think I accidentally created it in my newbie attempts to move wget to that location.)

Eventually, after a lot of clueless poking and searching, I decided that wget belonged in /usr/bin, and wget.1 should go in /usr/share/man/man1. I think it worked–wget responds when I invoke it, and I can get its man pages.

I never did figure out where the global wgetrc file belonged, though. Any ideas? Did Apple switch a lot of directories around when they went to 10.3.9?


Hilary –

You can create those directories if you don’t have them, but it’s largely a matter of convention. Things in /usr/local are normally not part of the operating system and so won’t be overwritten by future releases of the operating system. If you put the binary in /usr/local/bin, you probably want to make sure that /usr/local/bin is also on your PATH just for convenience. But there should be no problem with having it in /usr/bin.

The exception here is the wgetrc. The binary *does* expect to find that in /usr/local/etc, but I don’t think there’s any problem if it’s missing. You can also put it, or variations on it, in your home directory as .wgetrc .
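For reference, a minimal .wgetrc might look like this — these particular settings are purely illustrative, not anything shipped with the binary:

```
# ~/.wgetrc — per-user wget defaults
# Retry failed downloads a few times
tries = 3
# Resume partial downloads by default
continue = on
# Pause a second between requests when mirroring
wait = 1
```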


Thanks! I love wget on my *nix boxes. Consequently I missed it on my Mac. I think it used to be included in the developer kit for OS X back in the 10.2 days… why-ever they “replaced” it with curl is beyond me

wget rules!! curl sucks!! Thanks for compiling wget into a binary!! It works great!!

thanks for this pre-built binary! was one of the first results on google

Thanks – you’re a star!

This is great. Thanks for making this wget build for OS X. This is my first Mac box; the old ones were all Linux systems where I used wget all the time. I even built wget for Win32 systems for others.

Thanks again!

Just wanted to let everyone know this works in tiger too. I have been missing wget from mac and didn’t know about curl. Just copied to /usr/bin and worked great. Thanks!

Hi there, just wanted to say a big thanks for wget – fantastic 😀 (Prashaant, Auckland, NZ)

Cheers dude, this saved me digging out the develop tools disc and fighting with ./configure 😉

Great, just stuck it in /usr/bin on 10.4 and good to go…thanks!

I’m almost positive it’s possible to download an HTML page and strip it of all HTML tags. Is it wget that can do this, or am I thinking of another command that’s run after getting the HTML file? (Someone I know has done this.)

I think you mean ‘lynx’.
If you use it like
lynx -dump URL > dump.txt
you’ll get a plain-text rendering of the requested page.
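wget can also play a part here if you pipe its output through sed — a crude sketch (the URL is a placeholder, and this simple pattern won’t handle every page; lynx -dump gives a nicer rendering):

```shell
# Usage idea (placeholder URL):
#   wget -q -O - http://example.com/page.html | sed 's/<[^>]*>//g' > dump.txt
# The sed expression deletes anything between angle brackets:
printf '<p>hello <b>world</b></p>\n' | sed 's/<[^>]*>//g'
```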

Thanks for the binary. I ran it on Tiger, no problem. Great tool; I’m surprised Apple doesn’t include it in their distribution.

Thanks for making the binary available for d/l.

John Rice: the reason wget is not on OS X is that ‘wget’ is GPL’d, while ‘curl’ uses a more permissive MIT-style licence.

in all spirit of *nix-ness, you should just build it from darwin ports or fink. 🙂

Yes, but then, I think, it would be rather dependent on those environments? Is that right? I have limited experience of fink and darwin ports, and while they’re good, I always end up installing quite a lot of stuff beyond the bit I want.

The main reason I rebuild packages like this is so they’ll run on a standard Mac OS X install without the need to install anything else. I’ve done bits of Bacula the same way.

Just wanted to say thanks. I completely avoided having to install FINK on my virgin Tiger install. 🙂

To do what’s called for in the Readme.txt you’ll need to issue these commands:
sudo mkdir -p /usr/local/bin /usr/local/man/man1 /usr/local/etc

sudo cp wget /usr/local/bin/
sudo cp wget.1 /usr/local/man/man1/
sudo cp wgetrc /usr/local/etc/

But you could just do this one and it will work:
sudo cp wget /usr/bin

My favorite way to use wget is:
wget -r http://www.helpmedude.com

the -r flag is recursive: it downloads the site at that address into the current directory. Use -l 5 to restrict the recursion to 5 levels (the default), or -l 0 for infinite.
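A few other recursion-related flags worth knowing (the URLs are placeholders; ‘wget -h’ lists everything):

```shell
# Mirror a site: recursive, timestamped, infinite depth
wget -m http://example.com/

# Recurse but never ascend above the starting directory
wget -r -np http://example.com/docs/

# Rewrite links for local browsing and fetch images/stylesheets too
wget -r -k -p http://example.com/
```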

NOTE: sudo (“superuser do”) runs a single command as another user — with no user given, the “Super User” named root. It asks for YOUR password (not root’s) the first time you run sudo, and remembers it for around five minutes in that Terminal session, so you may be prompted for your password again later.

Thanks for this – it looks great, though I have a Terminal-newbie question (which is hopefully pretty basic).

I copied the files to the proper directories (some existed as hidden directories, and I created the others, via William Janoch’s commands above). But now, when I run Terminal and try Wget, it says it can’t find the command. I can double-click Wget and it launches fine. Is this something with changing the shell? Or somehow telling Terminal that a new command exists?

I also tried installing Wget via Darwin Ports, but – same thing – it can’t run Darwin Ports either after installation.

I am running the latest Tiger release on a G4 Powerbook.


Hi Devyn –

You may need to put the directory containing wget on your path. Typically you’d do this by editing the .bashrc or .bash_profile in your home directory to include something like:

export PATH=$PATH:/usr/local/bin

and then logging out and logging back in again.
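To check the change took effect in a new Terminal window, something like this works:

```shell
# Add /usr/local/bin to the search path for this shell session
export PATH="$PATH:/usr/local/bin"

# Confirm the directory is now on the path
echo "$PATH" | grep -q '/usr/local/bin' && echo "PATH updated"
```

After that, `which wget` should report the binary’s full path, assuming it was copied to /usr/local/bin.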

Alternatively, you can try running the command by specifying the path explicitly:

/usr/local/bin/wget

You may have done this – apologies if this is obvious.

That worked! Thank you!

And no… it was not obvious… (shrugs shoulders… newbie question answered… )

Thank you so much for having this up on the web! Your instructions were understandable, easy, and best of all, they worked! I wish every unix-related site was this wonderful!

Like a charm, thanks a lot. PS: I almost didn’t survive your captcha check, and trust me I’m a human!

Thanks indeed. Installing a whole package manager like darwin ports or fink just so I could have wget every time I switched computers or reinstalled the OS was getting old. This goes on “The CD.”

Thank you, wget is an essential (Google PageRank love for you)

Another newbie question. I have OS X 10.4 and rarely used Terminal. Under /usr/, there was no local folder at all, so I had to create all the folders, as stated in the readme. I tried qsf’s advice, but I probably didn’t understand it very well and made a mistake. Could someone please help a person who’s never used Terminal? Thanks.

Thanks for this, I finally managed to install wget using the info here. However, I have a question.
Whenever I try to use wget, it returns the error
dyld: wget can’t open library: /usr/lib/libssl.0.9.7.dylib (No such file or directory, errno = 2)
Trace/BPT trap

I’m a unix nooblet maximus, so I don’t really know what to do now.

any advice would be great.

I was shocked when I tried to use wget on 10.4 and to my astonishment:

patient_zero:~ root# wget blah.com
su: wget: command not found

so I went looking around and I couldn’t believe 10.4 did not come with wget.


Running Tiger on a G4 PB.

All directories exist, and everything was put where it should be, but “man wget” doesn’t work.

Did I miss something?

thanks and well done :^)

Hi Alf –

The place where ‘man’ looks for man pages on Tiger is defined in /usr/share/misc/man.conf (see ‘man man’!).

You might either want to add an extra line ‘MANPATH /usr/local/man’, or copy wget.1 into a /usr/local/share/man/man1 folder, which is on the default path.
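For example (treat this as a sketch — the paths are the Tiger-era ones mentioned above):

```shell
# Option 1: tell man about /usr/local/man by appending to its config
echo 'MANPATH /usr/local/man' | sudo tee -a /usr/share/misc/man.conf

# Option 2: put the page somewhere man already searches
sudo mkdir -p /usr/local/share/man/man1
sudo cp wget.1 /usr/local/share/man/man1/
```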

And ‘wget -h’ will also give you a pretty good summary of the options.


Thanks Q.

While I love the Mac, my Unix skills aren’t quite there yet. Still sussing out the directory structure (among many other things… :^)

And may I humbly add a “d’oh!” for not having thought of “man man” myself…!


Your 10.3 build works on 10.4.6 Intel. Thanks.


I’m running Tiger 10.4.6 and I followed the instructions exactly. I had to create the man1 folder because I have man8 instead. Other than that I followed your instructions and copied all necessary files into the directories stated. It’s still not working. I didn’t modify my path variable but instead cd /usr/local/bin/wget and get “command not found”. Suggestions?
BTW thanks for this. I don’t like curl and want to be able to use wget on my mac.

it works well on macbook, too.
Thanks, Rue

And if you don’t want to use the command line, there’s Deep Vacuum. wget with a graphical interface.

thanks a lot…i am a big user of wget! and you made my macbook pro experience easier



© Copyright Quentin Stafford-Fraser