This may be used to transfer files:
ssh server "cat remote_file" > local_file
Since the scp command does exactly the same thing, the usefulness of such a command is questionable,
but imagine that you need to transfer a file of several megabytes over a fairly limited bandwidth:
ssh server "gzip -c remote_file" > local_file.gz
Here, the server compresses the file, and the command writes the compressed file on the client machine.
You can go even further if you do not want to retrieve a gzipped file, but still want to lower bandwidth usage:
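Without a server at hand, the server-side half of this pipeline can be simulated locally. The file names below are placeholders invented for the demonstration:

```shell
# Stand-in for the remote file.
printf 'line one\nline two\n' > /tmp/remote_file

# Same as the ssh pipeline above, minus the ssh hop:
# gzip writes compressed data to stdout, which we redirect
# into the .gz file the client would receive.
gzip -c /tmp/remote_file > /tmp/local_file.gz

# gunzip -t tests the archive's integrity without extracting it.
gunzip -t /tmp/local_file.gz && echo "archive OK"
```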
ssh server "gzip -c remote_file" | gunzip > local_file
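The full compress-then-decompress round trip can likewise be checked locally before involving a real server; the paths here are placeholders:

```shell
# Stand-in for the remote file.
echo "payload" > /tmp/demo_remote

# Compress on one end, decompress on the other, as the
# ssh pipeline above does across two machines.
gzip -c /tmp/demo_remote | gunzip > /tmp/demo_local

# The received copy should be byte-identical to the original.
cmp /tmp/demo_remote /tmp/demo_local && echo "files match"
```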
With this method, tar writes the compressed tarball to standard output. You then pipe that into an ssh session running the extract version of the same tar command, along with a change-directory argument (-C). This essentially sends the compressed tarball into a decompression process at the other end, over a secure ssh "pipe".
The result is a pretty quick file transfer which, as the data is sent in compressed gzip form (or bzip2 if you replace the z with a j in the tar commands), saves on bandwidth too.
Here is an example of how to do this, assuming you are in (for example) /var/www/html/ and the website you want to transfer is in the folder www.example.com.
tar czf - www.example.com/ | ssh firstname.lastname@example.org tar xzf - -C ~/
This will send the entire www.example.com folder over to the home folder on your target server, in compressed form, over an encrypted connection.
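The same tar-to-tar pipeline can be rehearsed locally before pointing it at a real server; the directories below are made up for the demonstration:

```shell
# Build a sample source tree standing in for www.example.com.
mkdir -p /tmp/src/www.example.com
echo "<html></html>" > /tmp/src/www.example.com/index.html
mkdir -p /tmp/dst

# Identical to the ssh command above, with the ssh hop removed:
# the first tar packs to stdout, the second unpacks from stdin
# into the directory given by -C.
( cd /tmp/src && tar czf - www.example.com/ ) | tar xzf - -C /tmp/dst

# The copy should match the original tree.
diff -r /tmp/src/www.example.com /tmp/dst/www.example.com && echo "copy verified"
```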
1. To get a directory or file from a server:
ssh username@from_server "tar czf - directory_to_get" | tar xvzf - -C path_where_to_put
2. To send a directory to another host:
tar czf - directory_to_send | ssh username@to_server "tar xzf - -C path_where_to_put"
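As mentioned earlier, replacing z with j in both tar commands selects bzip2 instead of gzip. A sketch of recipe 2 in that form, with the placeholder names unchanged, plus a local check of the j flags (assuming bzip2 is installed):

```shell
# Recipe 2 with bzip2 (placeholder host and path names):
#   tar cjf - directory_to_send | ssh username@to_server "tar xjf - -C path_where_to_put"

# The flag substitution can be verified locally:
mkdir -p /tmp/send/directory_to_send /tmp/recv
echo "data" > /tmp/send/directory_to_send/file.txt
( cd /tmp/send && tar cjf - directory_to_send ) | tar xjf - -C /tmp/recv
cat /tmp/recv/directory_to_send/file.txt
```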