I need to bulk load a large file into PostgreSQL. I would normally use the COPY command, but this file needs to be loaded from a remote client machine. With MSSQL, I can install the local tools and use bcp.exe on the client to connect to the server.
Is there an equivalent way for PostgreSQL? If not, what is the recommended way of loading a large file from a client machine if I cannot copy the file to the server first?
Answer1:

The COPY command is supported in <a href="http://developer.postgresql.org/pgdocs/postgres/protocol-flow.html#PROTOCOL-COPY" rel="nofollow"><strong>PostgreSQL Protocol v3.0</strong></a> (PostgreSQL 7.4 or newer).
The only thing you need to use COPY from a remote client is a <strong>libpq</strong>-enabled client, such as the psql command-line utility.
From the remote client run:
$ psql -d dbname -h 192.168.1.1 -U uname < yourbigscript.sql
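To make this concrete, the script you pipe in can carry the bulk data itself using COPY ... FROM STDIN with the rows inlined. A minimal sketch (the table, column, and file names here are made up for illustration):

```shell
# Build a self-contained load script: the COPY ... FROM STDIN data is
# inlined after the statement and terminated by a line containing "\."
cat > yourbigscript.sql <<'EOF'
CREATE TABLE IF NOT EXISTS items (id integer, name text);
COPY items (id, name) FROM STDIN WITH (FORMAT csv);
1,apple
2,banana
\.
EOF

# Then, from the client machine (requires a reachable server):
#   psql -d dbname -h 192.168.1.1 -U uname < yourbigscript.sql
```

The data never needs to exist as a file on the server; psql streams it over the connection.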
Answer2:

You can use the \copy command from the psql tool, for example:
psql -h IP_REMOTE_POSTGRESQL -d DATABASE -U USER_WITH_RIGHTS -c "\copy TABLE(FIELD_LIST_SEPARATE_BY_COMMA) from 'FILE_IN_CLIENT_MACHINE' with csv header"

The file path is resolved on the client machine, so a relative path (for example, a file in the current directory) works.
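Filling in the placeholders, a sketch of what this looks like end to end (host, database, table, and column names below are invented):

```shell
# Create a sample CSV on the client machine.
cat > users.csv <<'EOF'
id,name
1,alice
2,bob
EOF

# \copy reads the file on the CLIENT side and streams it over the
# connection, so the server never needs filesystem access to the file
# (requires a reachable server):
#   psql -h 192.168.1.1 -d mydb -U myuser \
#        -c "\copy users(id, name) from 'users.csv' with csv header"
```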
Answer3:

Assuming you have some sort of client with which to run the query, you can use the COPY FROM STDIN form of the COPY command: <a href="http://www.postgresql.org/docs/current/static/sql-copy.html" rel="nofollow">http://www.postgresql.org/docs/current/static/sql-copy.html</a>
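With psql as the client, the STDIN form means you can redirect a local file straight into the server-side COPY statement. A sketch, with made-up table and host names:

```shell
# Sample data on the client machine.
cat > events.csv <<'EOF'
id,kind
1,click
2,view
EOF

# COPY ... FROM STDIN reads whatever the client sends on the connection,
# so redirecting the file into psql streams it to the server
# (requires a reachable server):
#   psql -h 192.168.1.1 -d mydb -U myuser \
#        -c "COPY events(id, kind) FROM STDIN WITH (FORMAT csv, HEADER)" < events.csv
```

Unlike a plain COPY FROM 'filename', this works even though the file only exists on the client.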
Answer4:

Use the \copy command in psql to load the data:

$ psql -h <IP> -p <port> -U <username> -d <database>
database# \copy schema.tablename from '/home/localdir/bulkdir/file.txt' delimiter as '|'
database# \copy schema.tablename from '/home/localdir/bulkdir/file.txt' with csv header
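For the pipe-delimited variant, the input file is just plain text with one row per line. A small sketch (the directory and table names below are illustrative, not the ones from the commands above):

```shell
# Build a pipe-delimited sample file on the client.
mkdir -p bulkdir
printf '1|first row\n2|second row\n' > bulkdir/file.txt

# Inside an interactive psql session on the client:
#   \copy schema.tablename from 'bulkdir/file.txt' delimiter as '|'
```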