Frustrating, but hopefully easy Linux <- Windows ftp question
Hello all! I'm trying to move a website from a Windows server to a Linux server. The Windows server is a shared environment; the Linux server is a VPS. I'm logged into the VPS and connect by ftp to the Windows server; the connection works fine, and I basically just want to download all the image files. There are about 20,000 image files in 300 or so subdirectories, so I figured I could grab them with an mget command. I've tried

    mget -r /wwwroot/*.jpg

and it doesn't work. Any time I combine the -r option with a wildcard, nothing gets downloaded recursively.
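Would something like wget be a reasonable alternative here? Assuming wget is installed on the VPS (I haven't actually checked yet, so that's an assumption), I understand it can recurse over FTP and filter by extension, along these lines (USER, PASSWORD, and windows-server are placeholders for my real details):

    # recurse starting at /wwwroot, don't climb to the parent directory, keep only .jpg files
    wget -r -np -A 'jpg' ftp://USER:PASSWORD@windows-server/wwwroot/

Is that the right direction, or is there a better tool for this?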
I am very new to Linux, so if anyone has any advice, please share. Downloading everything to my home PC and then uploading it is an option, but a very slow one; I would much rather do a server-to-server transfer.
I've also tried yafc, but I run into the same problem with its get command: it will get any .jpg files in the root directory, but it will not recurse. If I take off the *.jpg and replace it with a directory name, it recurses through that directory without issue.
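Along the same lines, would lftp's mirror command do what I'm after? Assuming lftp is available (again, just an assumption about the VPS), I think an include filter would limit it to the images, something like:

    # mirror only the .jpg files from /wwwroot into ./images, keeping the subdirectory layout
    lftp -u USER,PASSWORD -e "mirror --include-glob '*.jpg' /wwwroot/ ./images/; quit" ftp://windows-server

Does that look sane?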
I may just have to download the entire site, but I'd like to avoid that if I can, since it would add another GB of transfer, and then I would still have to move the 20,000 image files into place afterwards; I also seem to get errors when trying to move that many files at once.
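If I do end up pulling the whole site, I'm guessing the errors when moving 20,000 files come from the shell expanding the wildcard into too many arguments for mv. Would a find-based move like this avoid that (the paths are just examples)?

    # hand the files to mv in batches instead of one huge argument list
    find /path/to/site -name '*.jpg' -exec mv -t /var/www/images/ {} +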