Save PL/pgSQL output from PostgreSQL to a CSV file












What is the easiest way to save PL/pgSQL output from a PostgreSQL database to a CSV file?



I'm using PostgreSQL 8.4 with pgAdmin III and its PSQL plugin, which is where I run my queries.










  • 1




    See also stackoverflow.com/q/1120109/287948
    – Peter Krauss
    Mar 29 '15 at 10:34
















sql postgresql csv postgresql-copy






asked Oct 4 '09 at 22:58 by Hoff
edited Apr 9 '17 at 19:00 by Erwin Brandstetter



16 Answers






1113














Do you want the resulting file on the server, or on the client?



Server side



If you want something easy to re-use or automate, you can use Postgresql's built-in COPY command. e.g.



Copy (Select * From foo) To '/tmp/test.csv' With CSV DELIMITER ',';


This approach runs entirely on the remote server - it can't write to your local PC. It also needs to be run as a Postgres "superuser" (normally called "postgres") because Postgres can't stop it doing nasty things with that machine's local filesystem.



That doesn't actually mean you have to be connected as a superuser (automating that would be a security risk of a different kind), because you can use the SECURITY DEFINER option to CREATE FUNCTION to make a function which runs as though you were a superuser.



The crucial part is that your function is there to perform additional checks, not just by-pass the security - so you could write a function which exports the exact data you need, or you could write something which can accept various options as long as they meet a strict whitelist. You need to check two things:




  1. Which files should the user be allowed to read/write on disk? This might be a particular directory, for instance, and the filename might have to have a suitable prefix or extension.

  2. Which tables should the user be able to read/write in the database? This would normally be defined by GRANTs in the database, but the function is now running as a superuser, so tables which would normally be "out of bounds" will be fully accessible. You probably don’t want to let someone invoke your function and add rows on the end of your “users” table…


I've written a blog post expanding on this approach, including some examples of functions that export (or import) files and tables meeting strict conditions.





Client side



The other approach is to do the file handling on the client side, i.e. in your application or script. The Postgres server doesn't need to know what file you're copying to, it just spits out the data and the client puts it somewhere.



The underlying syntax for this is the COPY TO STDOUT command, and graphical tools like pgAdmin will wrap it for you in a nice dialog.



The psql command-line client has a special "meta-command" called \copy, which takes all the same options as the "real" COPY, but is run inside the client:



\copy (Select * From foo) To '/tmp/test.csv' With CSV


Note that there is no terminating ;, because meta-commands are terminated by newline, unlike SQL commands.



From the docs:




Do not confuse COPY with the psql instruction \copy. \copy invokes COPY FROM STDIN or COPY TO STDOUT, and then fetches/stores the data in a file accessible to the psql client. Thus, file accessibility and access rights depend on the client rather than the server when \copy is used.




Your application programming language may also have support for pushing or fetching the data, but you cannot generally use COPY FROM STDIN/TO STDOUT within a standard SQL statement, because there is no way of connecting the input/output stream. PHP's PostgreSQL handler (not PDO) includes very basic pg_copy_from and pg_copy_to functions which copy to/from a PHP array, which may not be efficient for large data sets.
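For instance, the psycopg2 driver for Python exposes this stream through its `copy_expert` method. The sketch below builds the COPY statement with a small helper; the connection string, table name, and file path are made-up examples, not anything from the original answer:

```python
def copy_to_csv_sql(query):
    """Wrap a SELECT in a client-side COPY ... TO STDOUT statement producing CSV with a header."""
    return "COPY ({}) TO STDOUT WITH CSV HEADER".format(query)

# With psycopg2 (hypothetical connection details), the data streams to the
# client, so no server filesystem access or superuser rights are needed:
#
#   import psycopg2
#   conn = psycopg2.connect("dbname=mydb user=me host=db.example.com")
#   with conn.cursor() as cur, open("foo.csv", "w") as f:
#       cur.copy_expert(copy_to_csv_sql("SELECT * FROM foo"), f)

print(copy_to_csv_sql("SELECT * FROM foo"))
```

The same pattern works with any driver that exposes the COPY protocol; the key point is that the file handle lives on the client.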






  • 107




Obviously the above example sometimes requires the user to be a superuser; here's a version for ordinary people ;) echo "COPY (SELECT * from foo) TO STDOUT with CSV HEADER" | psql -o '/tmp/test.csv' database_name
    – Drachenfels
    Apr 17 '12 at 17:26








  • 10




@Drachenfels: \copy works, too -- there, the paths are relative to the client, and no semicolon is needed/allowed. See my edit.
    – krlmlr
    Feb 13 '13 at 10:12






  • 2




    @IMSoP: How would you add a COPY statement to an sql (on postgres 9.3) function? So the query gets saved to a .csv file?
    – jO.
    Nov 12 '13 at 21:24






  • 11




    It looks like copy needs to be a one-liner. So you don't get the beauty of formatting the sql the way you want, and just putting a copy/function around it.
    – isaaclw
    Jan 17 '14 at 13:49






  • 1




@AndreSilva As the answer states, \copy is a special meta-command in the psql command-line client. It won't work in other clients, like pgAdmin; they will probably have their own tools, such as graphical wizards, for doing this job.
    – IMSoP
    May 2 '18 at 17:49



















432














There are several solutions:



1 psql command



psql -d dbname -t -A -F"," -c "select * from users" > output.csv



This has the big advantage that you can use it via SSH, like ssh postgres@host command - enabling you to retrieve the output on your local machine.



2 postgres copy command



COPY (SELECT * from users) To '/tmp/output.csv' With CSV;



3 psql interactive (or not)



>psql dbname
psql>\f ','
psql>\a
psql>\o '/tmp/output.csv'
psql>SELECT * from users;
psql>\q


All of them can be used in scripts, but I prefer #1.



4 pgAdmin, but that's not scriptable.
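One caveat with option #1: psql's -A -F"," mode simply joins column values with the separator and applies no CSV quoting, so embedded commas corrupt the file. Python's standard csv module illustrates the quoting a real CSV writer applies (a sketch with made-up rows, not data from the answer):

```python
import csv
import io

# Hypothetical result rows; the second value contains an embedded comma.
rows = [("1", "Smith, John"), ("2", "O'Brien")]

# Naive joining (effectively what psql -A -F"," does): no quoting at all.
naive = "\n".join(",".join(r) for r in rows)

# Proper CSV: fields containing the delimiter are double-quoted.
buf = io.StringIO()
csv.writer(buf).writerows(rows)
proper = buf.getvalue()

print(naive)   # the embedded comma is indistinguishable from a separator
print(proper)  # "Smith, John" is quoted, so the row parses back correctly
```

This is why the COPY ... WITH CSV family of commands is generally safer than separator-flag tricks.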






  • 27




    IMHO the first option is error prone, because it doesn't include proper escaping of comma in exported data.
    – Piohen
    May 6 '13 at 21:07










  • @Piohen as far as I remember it does because it will quote strings, but I'm not 100% sure, better to test.
    – sorin
    May 7 '13 at 9:09






  • 4




    Also, psql doesn't quote cell values, so if ANY of your data uses the delimiter, your file will be corrupted.
    – Cerin
    Apr 8 '14 at 21:39






  • 5




    @Cerin -t is a synonym for --tuples-only (turn off printing of column names and result row count footers, etc.) - omit it to get column headers
    – ic3b3rg
    Jun 5 '14 at 21:40








  • 20




    Just tested the comma-escaping claim—it’s true, method #1 does not escape commas in values.
    – MrColes
    Sep 17 '14 at 21:07



















79














In the terminal (while connected to the db) set output to the csv file

1) Set field separator to ',':

\f ','

2) Set output format unaligned:

\a

3) Show only tuples:

\t

4) Set output:

\o '/tmp/yourOutputFile.csv'

5) Execute your query:

select * from YOUR_TABLE;

6) Output:

\o


You will then be able to find your csv file in this location:



cd /tmp


Copy it using the scp command or edit using nano:



nano /tmp/yourOutputFile.csv





  • 4




and \o in order to print to console again
    – metdos
    Aug 6 '12 at 14:57






  • 1




This will not produce a CSV file, it will just record the command output to the text file (which does not make it comma-separated).
    – Ruslan Kabalin
    Nov 29 '12 at 16:39










  • @RuslanKabalin yes I have just noticed that and amended the instruction to create comma-separated output (csv)
    – Marcin Wasiluk
    Nov 30 '12 at 11:01






  • 5




    I'd improve this answer by noting that the "csv" output will not be properly escaped and each time a sql command is executed the results are concatenated to the output file.
    – Danny Armstrong
    Feb 6 '14 at 23:50










  • What about newlines in field values? The COPY or \copy approaches handle them correctly (converting to standard CSV format); does this?
    – Wildcard
    Jan 7 '17 at 4:19





















32














If you're interested in all the columns of a particular table along with headers, you can use



COPY table TO '/some_destdir/mycsv.csv' WITH CSV HEADER;


This is a tiny bit simpler than



COPY (SELECT * FROM table) TO '/some_destdir/mycsv.csv' WITH CSV HEADER;


which, to the best of my knowledge, are equivalent.






  • 1




If the query is custom (i.e. having column aliases or joining different tables), the header will print out the column aliases just as they display on the screen.
    – Devy
    Nov 13 '13 at 21:58





















20














I had to use the COPY because I received the error message:



ERROR:  could not open file "/filepath/places.csv" for writing: Permission denied


So I used:



Copy (Select address, zip  From manjadata) To '/filepath/places.csv' With CSV;


and it is functioning


















    16














    psql can do this for you:



    edd@ron:~$ psql -d beancounter -t -A -F"," \
        -c "select date, symbol, day_close \
            from stockprices where symbol like 'I%' \
            and date >= '2009-10-02'"
    2009-10-02,IBM,119.02
    2009-10-02,IEF,92.77
    2009-10-02,IEV,37.05
    2009-10-02,IJH,66.18
    2009-10-02,IJR,50.33
    2009-10-02,ILF,42.24
    2009-10-02,INTC,18.97
    2009-10-02,IP,21.39
    edd@ron:~$


    See man psql for help on the options used here.






    • 12




      This isn't a true CSV file--watch it burn if there are commas in the data--so using the built-in COPY support is preferred. But this general technique is handy as a quick hack for exporting from Postgres in other delimited formats besides CSV.
      – Greg Smith
      Oct 6 '09 at 5:19



















    13














    CSV Export Unification



    This information isn't really well represented. As this is the second time I've needed to derive this, I'll put this here to remind myself if nothing else.



    Really the best way to do this (get CSV out of postgres) is to use the COPY ... TO STDOUT command. Though you don't want to do it the way shown in the answers here. The correct way to use the command is:



    COPY (select id, name from groups) TO STDOUT WITH CSV HEADER


    Remember just one command!



    It's great for use over ssh:



    $ ssh psqlserver.example.com 'psql -d mydb -c "COPY (select id, name from groups) TO STDOUT WITH CSV HEADER"' > groups.csv


    It's great for use inside docker over ssh:



    $ ssh pgserver.example.com 'docker exec -tu postgres postgres psql -d mydb -c "COPY groups TO STDOUT WITH CSV HEADER"' > groups.csv


    It's even great on the local machine:



    $ psql -d mydb -c 'COPY groups TO STDOUT WITH CSV HEADER' > groups.csv


    Or inside docker on the local machine?:



    docker exec -tu postgres postgres psql -d mydb -c 'COPY groups TO STDOUT WITH CSV HEADER' > groups.csv


    Or on a kubernetes cluster, in docker, over HTTPS??:



    kubectl exec -t postgres-2592991581-ws2td 'psql -d mydb -c "COPY groups TO STDOUT WITH CSV HEADER"' > groups.csv


    So versatile, much commas!



    Do you even?



    Yes I did, here are my notes:



    The COPYses



    Using \copy effectively executes file operations on whatever system the psql command is running on, as the user who is executing it. If you connect to a remote server, it's simple to copy data files on the system executing psql to/from the remote server.



    COPY executes file operations on the server as the backend process user account (default postgres), file paths and permissions are checked and applied accordingly. If using TO STDOUT then file permissions checks are bypassed.



    Both of these options require subsequent file movement if psql is not executing on the system where you want the resultant CSV to ultimately reside. This is the most likely case, in my experience, when you mostly work with remote servers.



    It is more complex to configure something like a TCP/IP tunnel over ssh to a remote system for simple CSV output, but for other output formats (binary) it may be better to \copy over a tunneled connection, executing a local psql. In a similar vein, for large imports, moving the source file to the server and using COPY is probably the highest-performance option.



    PSQL Parameters



    With psql parameters you can format the output like CSV but there are downsides like having to remember to disable the pager and not getting headers:



    $ psql -P pager=off -d mydb -t -A -F',' -c 'select * from groups;'
    2,Technician,Test 2,,,t,,0,,
    3,Truck,1,2017-10-02,,t,,0,,
    4,Truck,2,2017-10-02,,t,,0,,


    Other Tools



    No, I just want to get CSV out of my server without compiling and/or installing a tool.







    • Where do the results get saved to ? My query runs but the file doesn't show up anywhere on my computer. This is what I'm doing : COPY (select a,b from c where d = '1') TO STDOUT WITH CSVHEADER > abcd.csv
      – kRazzy R
      Apr 25 '18 at 17:00












    • @kRazzyR The output goes to stdout of the psql command, so ultimately whatever you do with stdout is where the data goes. In my examples I use '> file.csv' to redirect to a file. You want to make sure that is outside the command being sent to the server through the psql -c parameter. See the 'local machine' example.
      – joshperry
      Apr 26 '18 at 2:02





















    11














    In pgAdmin III there is an option to export to file from the query window. In the main menu it's Query -> Execute to file, or there's a button that does the same thing (it's a green triangle with a blue floppy disk, as opposed to the plain green triangle which just runs the query). If you're not running the query from the query window then I'd do what IMSoP suggested and use the \copy command.







    • IMSoP's answer didn't work for me as I needed to be a super admin. This worked a treat. Thanks!
      – Mike
      Jan 31 '12 at 22:08





















    10














    I'm working on AWS Redshift, which does not support the COPY TO feature.



    My BI tool supports tab-delimited CSVs though, so I used the following:



     psql -h dblocation -p port -U user -d dbname -F $'\t' --no-align -c "SELECT * FROM TABLE" > outfile.csv






      6














      I've written a little tool called psql2csv that encapsulates the COPY query TO STDOUT pattern, resulting in proper CSV. Its interface is similar to psql.



      psql2csv [OPTIONS] < QUERY
      psql2csv [OPTIONS] QUERY


      The query is assumed to be the contents of STDIN, if present, or the last argument. All other arguments are forwarded to psql except for these:



      -h, --help           show help, then exit
      --encoding=ENCODING  use a different encoding than UTF8 (Excel likes LATIN1)
      --no-header          do not output a header






      • Works great. Thank you.
        – AlexM
        Nov 3 '17 at 6:52



















      5














      If you have a longer query and you like to use psql, then put your query into a file and use the following command:



      psql -d my_db_name -t -A -F";" -f input-file.sql -o output-file.csv






      • FWIW, I had to use -F"," instead of -F";" to generate a CSV file that would open correctly in MS Excel
        – CFL_Jeff
        May 31 '18 at 19:44



















      5














      I tried several things, but few of them were able to give me the desired CSV with header details.



      Here is what worked for me.



      psql -d dbname -U username \
        -c "COPY ( SELECT * FROM TABLE ) TO STDOUT WITH CSV HEADER " \
        > OUTPUT_CSV_FILE.csv






        3














        New version - psql 12 - will support --csv.




        psql - devel



        --csv



        Switches to CSV (Comma-Separated Values) output mode. This is equivalent to \pset format csv.





        csv_fieldsep



        Specifies the field separator to be used in CSV output format. If the separator character appears in a field's value, that field is output within double quotes, following standard CSV rules. The default is a comma.




        Usage:



        psql -c "SELECT * FROM pg_catalog.pg_tables" --csv  postgres

        psql -c "SELECT * FROM pg_catalog.pg_tables" --csv -P csv_fieldsep='^' postgres

        psql -c "SELECT * FROM pg_catalog.pg_tables" --csv postgres > output.csv






          1














          JackDB, a database client in your web browser, makes this really easy. Especially if you're on Heroku.



          It lets you connect to remote databases and run SQL queries on them.



          (Source: jackdb-heroku screenshot, http://static.jackdb.com/assets/img/blog/jackdb-heroku-oauth-connect.gif)





          Once your DB is connected, you can run a query and export to CSV or TXT (see bottom right).





          (jackdb-export screenshot)



          Note: I'm in no way affiliated with JackDB. I currently use their free services and think it's a great product.







            0














            To download a CSV file with column names as the header, use this command:



            Copy (Select * From tableName) To '/tmp/fileName.csv' With CSV HEADER;






              -3














import json

# 'conn' is assumed to be an already-open database connection (e.g. from psycopg2).
cursor = conn.cursor()
qry = """ SELECT details FROM test_csvfile """
cursor.execute(qry)
rows = cursor.fetchall()

value = json.dumps(rows)

with open("/home/asha/Desktop/Income_output.json", "w+") as f:
    f.write(value)
print('Saved to File Successfully')
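As the comments below point out, this snippet writes JSON rather than CSV. A sketch using Python's standard csv module writes a real CSV file instead; here the database rows are stubbed out with made-up values, since the connection above is not shown:

```python
import csv

# Stub rows standing in for cursor.fetchall(); in real use these would come
# from the database query. The second value contains an embedded comma.
rows = [(1, "alpha"), (2, "beta, with comma")]

# newline="" is required so csv handles line endings itself.
with open("income_output.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "details"])  # header row
    writer.writerows(rows)             # fields with commas are quoted
```

Unlike json.dumps, csv.writer quotes any field containing the delimiter, so the output parses back as standard CSV.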





              • 3




                Please explain what you did by editing the answer; avoid code-only answers
                – GGO
                Feb 27 '18 at 12:09






              • 3




                Thank you for this code snippet, which might provide some limited short-term help. A proper explanation would greatly improve its long-term value by showing why this is a good solution to the problem, and would make it more useful to future readers with other, similar questions. Please edit your answer to add some explanation, including the assumptions you've made.
                – Toby Speight
                Feb 27 '18 at 12:48






              • 2




                This will produce a json file, not a csv file.
                – nvoigt
                Feb 27 '18 at 13:23











              Your Answer






              StackExchange.ifUsing("editor", function () {
              StackExchange.using("externalEditor", function () {
              StackExchange.using("snippets", function () {
              StackExchange.snippets.init();
              });
              });
              }, "code-snippets");

              StackExchange.ready(function() {
              var channelOptions = {
              tags: "".split(" "),
              id: "1"
              };
              initTagRenderer("".split(" "), "".split(" "), channelOptions);

              StackExchange.using("externalEditor", function() {
              // Have to fire editor after snippets, if snippets enabled
              if (StackExchange.settings.snippets.snippetsEnabled) {
              StackExchange.using("snippets", function() {
              createEditor();
              });
              }
              else {
              createEditor();
              }
              });

              function createEditor() {
              StackExchange.prepareEditor({
              heartbeatType: 'answer',
              autoActivateHeartbeat: false,
              convertImagesToLinks: true,
              noModals: true,
              showLowRepImageUploadWarning: true,
              reputationToPostImages: 10,
              bindNavPrevention: true,
              postfix: "",
              imageUploader: {
              brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
              contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
              allowUrls: true
              },
              onDemand: true,
              discardSelector: ".discard-answer"
              ,immediatelyShowMarkdownHelp:true
              });


              }
              });














              draft saved

              draft discarded


















              StackExchange.ready(
              function () {
              StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstackoverflow.com%2fquestions%2f1517635%2fsave-pl-pgsql-output-from-postgresql-to-a-csv-file%23new-answer', 'question_page');
              }
              );

              Post as a guest















              Required, but never shown

























              16 Answers
              16






              active

              oldest

              votes








              16 Answers
              16






              active

              oldest

              votes









              active

              oldest

              votes






              active

              oldest

              votes









              1113














              Do you want the resulting file on the server, or on the client?



              Server side



              If you want something easy to re-use or automate, you can use Postgresql's built in COPY command. e.g.



              Copy (Select * From foo) To '/tmp/test.csv' With CSV DELIMITER ',';


              This approach runs entirely on the remote server - it can't write to your local PC. It also needs to be run as a Postgres "superuser" (normally called "root") because Postgres can't stop it doing nasty things with that machine's local filesystem.



              That doesn't actually mean you have to be connected as a superuser (automating that would be a security risk of a different kind), because you can use the SECURITY DEFINER option to CREATE FUNCTION to make a function which runs as though you were a superuser.



              The crucial part is that your function is there to perform additional checks, not just by-pass the security - so you could write a function which exports the exact data you need, or you could write something which can accept various options as long as they meet a strict whitelist. You need to check two things:




              1. Which files should the user be allowed to read/write on disk? This might be a particular directory, for instance, and the filename might have to have a suitable prefix or extension.

              2. Which tables should the user be able to read/write in the database? This would normally be defined by GRANTs in the database, but the function is now running as a superuser, so tables which would normally be "out of bounds" will be fully accessible. You probably don’t want to let someone invoke your function and add rows on the end of your “users” table…


              I've written a blog post expanding on this approach, including some examples of functions that export (or import) files and tables meeting strict conditions.





              Client side



              The other approach is to do the file handling on the client side, i.e. in your application or script. The Postgres server doesn't need to know what file you're copying to, it just spits out the data and the client puts it somewhere.



              The underlying syntax for this is the COPY TO STDOUT command, and graphical tools like pgAdmin will wrap it for you in a nice dialog.



              The psql command-line client has a special "meta-command" called copy, which takes all the same options as the "real" COPY, but is run inside the client:



              copy (Select * From foo) To '/tmp/test.csv' With CSV


              Note that there is no terminating ;, because meta-commands are terminated by newline, unlike SQL commands.



              From the docs:




              Do not confuse COPY with the psql instruction copy. copy invokes COPY FROM STDIN or COPY TO STDOUT, and then fetches/stores the data in a file accessible to the psql client. Thus, file accessibility and access rights depend on the client rather than the server when copy is used.




              Your application programming language may also have support for pushing or fetching the data, but you cannot generally use COPY FROM STDIN/TO STDOUT within a standard SQL statement, because there is no way of connecting the input/output stream. PHP's PostgreSQL handler (not PDO) includes very basic pg_copy_from and pg_copy_to functions which copy to/from a PHP array, which may not be efficient for large data sets.






              share|improve this answer



















              • 107




                Obviously above example requires sometimes user to be a superuser, here's a version for ordinary people ;) echo “COPY (SELECT * from foo) TO STDOUT with CSV HEADER” | psql -o '/tmp/test.csv' database_name
                – Drachenfels
                Apr 17 '12 at 17:26








              • 10




                @Drachenfels: copy works, too -- there, the paths are relative to the client, and no semicolon is needed/allowed. See my edit.
                – krlmlr
                Feb 13 '13 at 10:12






              • 2




                @IMSoP: How would you add a COPY statement to an sql (on postgres 9.3) function? So the query gets saved to a .csv file?
                – jO.
                Nov 12 '13 at 21:24






              • 11




                It looks like copy needs to be a one-liner. So you don't get the beauty of formatting the sql the way you want, and just putting a copy/function around it.
                – isaaclw
                Jan 17 '14 at 13:49






              • 1




                @AndreSilva As the answer states, copy is a special meta-command in the psql command-line client. It won't work in other clients, like pgAdmin; they will probably have their own tools, such as graphical wizards, for doing this job.
                – IMSoP
                May 2 '18 at 17:49
















              1113














              Do you want the resulting file on the server, or on the client?



              Server side



              If you want something easy to re-use or automate, you can use Postgresql's built in COPY command. e.g.



              Copy (Select * From foo) To '/tmp/test.csv' With CSV DELIMITER ',';


              This approach runs entirely on the remote server - it can't write to your local PC. It also needs to be run as a Postgres "superuser" (normally called "root") because Postgres can't stop it doing nasty things with that machine's local filesystem.



              That doesn't actually mean you have to be connected as a superuser (automating that would be a security risk of a different kind), because you can use the SECURITY DEFINER option to CREATE FUNCTION to make a function which runs as though you were a superuser.



              The crucial part is that your function is there to perform additional checks, not just by-pass the security - so you could write a function which exports the exact data you need, or you could write something which can accept various options as long as they meet a strict whitelist. You need to check two things:




              1. Which files should the user be allowed to read/write on disk? This might be a particular directory, for instance, and the filename might have to have a suitable prefix or extension.

              2. Which tables should the user be able to read/write in the database? This would normally be defined by GRANTs in the database, but the function is now running as a superuser, so tables which would normally be "out of bounds" will be fully accessible. You probably don’t want to let someone invoke your function and add rows on the end of your “users” table…


              I've written a blog post expanding on this approach, including some examples of functions that export (or import) files and tables meeting strict conditions.





              Client side



              The other approach is to do the file handling on the client side, i.e. in your application or script. The Postgres server doesn't need to know what file you're copying to, it just spits out the data and the client puts it somewhere.



              The underlying syntax for this is the COPY TO STDOUT command, and graphical tools like pgAdmin will wrap it for you in a nice dialog.



              The psql command-line client has a special "meta-command" called copy, which takes all the same options as the "real" COPY, but is run inside the client:



              copy (Select * From foo) To '/tmp/test.csv' With CSV


              Note that there is no terminating ;, because meta-commands are terminated by newline, unlike SQL commands.



              From the docs:




              Do not confuse COPY with the psql instruction copy. copy invokes COPY FROM STDIN or COPY TO STDOUT, and then fetches/stores the data in a file accessible to the psql client. Thus, file accessibility and access rights depend on the client rather than the server when copy is used.




              Your application programming language may also have support for pushing or fetching the data, but you cannot generally use COPY FROM STDIN/TO STDOUT within a standard SQL statement, because there is no way of connecting the input/output stream. PHP's PostgreSQL handler (not PDO) includes very basic pg_copy_from and pg_copy_to functions which copy to/from a PHP array, which may not be efficient for large data sets.






              share|improve this answer



















              • 107




                Obviously the above example sometimes requires the user to be a superuser; here's a version for ordinary people ;) echo "COPY (SELECT * from foo) TO STDOUT with CSV HEADER" | psql -o '/tmp/test.csv' database_name
                – Drachenfels
                Apr 17 '12 at 17:26








              • 10




                @Drachenfels: \copy works, too -- there, the paths are relative to the client, and no semicolon is needed/allowed. See my edit.
                – krlmlr
                Feb 13 '13 at 10:12






              • 2




                @IMSoP: How would you add a COPY statement to an SQL function (on Postgres 9.3)? So the query gets saved to a .csv file?
                – jO.
                Nov 12 '13 at 21:24






              • 11




                It looks like \copy needs to be a one-liner. So you don't get the beauty of formatting the sql the way you want, and just putting a \copy/function around it.
                – isaaclw
                Jan 17 '14 at 13:49






              • 1




                @AndreSilva As the answer states, \copy is a special meta-command in the psql command-line client. It won't work in other clients, like pgAdmin; they will probably have their own tools, such as graphical wizards, for doing this job.
                – IMSoP
                May 2 '18 at 17:49














              1113












              edited May 14 '16 at 12:49









              Community

              11














              answered Oct 4 '09 at 23:18









              IMSoP

              45.7k65693












              432














              There are several solutions:



              1 psql command



              psql -d dbname -t -A -F"," -c "select * from users" > output.csv



              This has the big advantage that you can use it via SSH, like ssh postgres@host command - enabling you to get the output remotely in a single step.



              2 postgres copy command



              COPY (SELECT * from users) To '/tmp/output.csv' With CSV;



              3 psql interactive (or not)



              >psql dbname
              psql>\f ','
              psql>\a
              psql>\o '/tmp/output.csv'
              psql>SELECT * from users;
              psql>\q


              All of them can be used in scripts, but I prefer #1.



              4 pgadmin but that's not scriptable.
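              Note that #1 emits psql's unaligned format rather than true CSV, so values containing commas are not quoted. A hedged variant that keeps the one-liner convenience but asks the server for properly quoted CSV (dbname and users are placeholders):

```shell
psql -d dbname -c "\copy (SELECT * FROM users) TO 'output.csv' WITH CSV HEADER"
```

              Since \copy runs inside the client, output.csv is written relative to the client's working directory and no superuser rights are needed.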






              share|improve this answer



















              • 27




                IMHO the first option is error prone, because it doesn't include proper escaping of commas in the exported data.
                – Piohen
                May 6 '13 at 21:07










              • @Piohen as far as I remember it does because it will quote strings, but I'm not 100% sure, better to test.
                – sorin
                May 7 '13 at 9:09






              • 4




                Also, psql doesn't quote cell values, so if ANY of your data uses the delimiter, your file will be corrupted.
                – Cerin
                Apr 8 '14 at 21:39






              • 5




                @Cerin -t is a synonym for --tuples-only (turn off printing of column names and result row count footers, etc.) - omit it to get column headers
                – ic3b3rg
                Jun 5 '14 at 21:40








              • 20




                Just tested the comma-escaping claim—it’s true, method #1 does not escape commas in values.
                – MrColes
                Sep 17 '14 at 21:07
















              edited May 19 '15 at 4:39









              MrValdez

              6,36394976














              answered Aug 8 '12 at 17:56









              sorin

              73.7k114365573













              79














              In terminal (while connected to the db) set output to the csv file



              1) Set field separator to ',':



              \f ','


              2) Set output format unaligned:



              \a


              3) Show only tuples:



              \t


              4) Set output:



              \o '/tmp/yourOutputFile.csv'


              5) Execute your query:



              select * from YOUR_TABLE;


              6) Output:



              \o


              You will then be able to find your csv file in this location:



              cd /tmp


              Copy it using the scp command or edit using nano:



              nano /tmp/yourOutputFile.csv
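              Put together, the whole exchange looks roughly like this (a sketch; the database, table, and file names are placeholders):

```
psql yourDbName
\f ','
\a
\t
\o '/tmp/yourOutputFile.csv'
select * from YOUR_TABLE;
\o
\q
```

              The second \o reverts output to the console, so any further queries print normally again.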





              share|improve this answer



















              • 4




                and \o in order to print to console again
                – metdos
                Aug 6 '12 at 14:57






              • 1




                This will not produce a CSV file, it will just record the command output to the text file (which does not make it comma-separated).
                – Ruslan Kabalin
                Nov 29 '12 at 16:39










              • @RuslanKabalin yes I have just noticed that and amended the instruction to create comma-separated output (csv)
                – Marcin Wasiluk
                Nov 30 '12 at 11:01






              • 5




                I'd improve this answer by noting that the "csv" output will not be properly escaped and each time a sql command is executed the results are concatenated to the output file.
                – Danny Armstrong
                Feb 6 '14 at 23:50










              • What about newlines in field values? The COPY or \copy approaches handle this correctly (convert to standard CSV format); does this?
                – Wildcard
                Jan 7 '17 at 4:19


















              edited Mar 16 '17 at 12:49









              yunque

              143214














              answered Jun 11 '12 at 11:18









              Marcin Wasiluk

              3,28722138












              • 4




                and o in order to print console again
                – metdos
                Aug 6 '12 at 14:57






              • 1




                This will not produce a CSV file, it will just record the command output to the text file (which does not make it the comma-separated).
                – Ruslan Kabalin
                Nov 29 '12 at 16:39










              • @RuslanKabalin yes I have just notticed that and ammended instruction to create comma-separated output (cvs)
                – Marcin Wasiluk
                Nov 30 '12 at 11:01






              • 5




                I'd improve this answer by noting that the "csv" output will not be properly escaped and each time a sql command is executed the results are concatenated to the output file.
                – Danny Armstrong
                Feb 6 '14 at 23:50










              • What about newlines in field values? The COPY or copy approaches handle correctly (convert to standard CSV format); does this?
                – Wildcard
                Jan 7 '17 at 4:19

















              32














              If you're interested in all the columns of a particular table along with headers, you can use



              COPY table TO '/some_destdir/mycsv.csv' WITH CSV HEADER;


              This is a tiny bit simpler than



              COPY (SELECT * FROM table) TO '/some_destdir/mycsv.csv' WITH CSV HEADER;


              which, to the best of my knowledge, is equivalent.
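As a side note on what WITH CSV HEADER actually emits: fields containing the delimiter, the quote character, or a newline are quoted, and embedded double quotes are doubled. Python's csv module follows the same convention, so it can be used to preview the file layout; the rows below are made up for illustration:

```python
import csv
import io

# Rows shaped like COPY table TO ... WITH CSV HEADER output:
# a header row first, then data rows.
rows = [
    ("id", "name"),                   # header row
    (1, "plain value"),
    (2, 'has "quotes" and, commas'),  # needs quoting and quote-doubling
]

buf = io.StringIO()
csv.writer(buf).writerows(rows)  # default dialect: comma delimiter, minimal quoting
print(buf.getvalue())
```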























              • 1




                If the query is custom (i.e. having column aliases or joining different tables), the header will print out the column aliases just as they display on the screen.
                – Devy
                Nov 13 '13 at 21:58


















              answered Jan 11 '13 at 20:34
              benjwadams


              20














              I had to use \copy because I received the error message:



              ERROR:  could not open file "/filepath/places.csv" for writing: Permission denied


              So I used:



              \copy (Select address, zip  From manjadata) To '/filepath/places.csv' With CSV;


              and it works


































                  edited May 26 '15 at 13:19
                  answered Jun 28 '14 at 21:41
                  maudulus


                      16














                      psql can do this for you:



                      edd@ron:~$ psql -d beancounter -t -A -F"," \
                                   -c "select date, symbol, day_close " \
                                      "from stockprices where symbol like 'I%' " \
                                      "and date >= '2009-10-02'"
                      2009-10-02,IBM,119.02
                      2009-10-02,IEF,92.77
                      2009-10-02,IEV,37.05
                      2009-10-02,IJH,66.18
                      2009-10-02,IJR,50.33
                      2009-10-02,ILF,42.24
                      2009-10-02,INTC,18.97
                      2009-10-02,IP,21.39
                      edd@ron:~$


                      See man psql for help on the options used here.
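The limitation noted in the comment below is easy to demonstrate: -A -F"," performs a plain join with no quoting, so a field that itself contains a comma becomes ambiguous, whereas a CSV-aware writer quotes it. A minimal Python sketch (the "ACME, Inc." symbol is a made-up value):

```python
import csv
import io

row = ["2009-10-02", "ACME, Inc.", "119.02"]

# What psql -A -F"," effectively does: a plain join, no quoting.
# The embedded comma is now indistinguishable from a delimiter.
naive = ",".join(row)
print(naive)

# What a CSV-aware writer does instead: the problematic field gets quoted.
buf = io.StringIO()
csv.writer(buf).writerow(row)
print(buf.getvalue().strip())
```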























                      • 12




                        This isn't a true CSV file--watch it burn if there are commas in the data--so using the built-in COPY support is preferred. But this general technique is handy as a quick hack for exporting from Postgres in other delimited formats besides CSV.
                        – Greg Smith
                        Oct 6 '09 at 5:19
















                      answered Oct 4 '09 at 23:12
                      Dirk Eddelbuettel


                      13














                      CSV Export Unification



                      This information isn't really well represented. As this is the second time I've needed to derive this, I'll put this here to remind myself if nothing else.



                      Really the best way to do this (get CSV out of postgres) is to use the COPY ... TO STDOUT command. Though you don't want to do it the way shown in the answers here. The correct way to use the command is:



                      COPY (select id, name from groups) TO STDOUT WITH CSV HEADER


                      Remember just one command!



                      It's great for use over ssh:



                      $ ssh psqlserver.example.com 'psql -d mydb -c "COPY (select id, name from groups) TO STDOUT WITH CSV HEADER"' > groups.csv


                      It's great for use inside docker over ssh:



                      $ ssh pgserver.example.com 'docker exec -tu postgres postgres psql -d mydb -c "COPY groups TO STDOUT WITH CSV HEADER"' > groups.csv


                      It's even great on the local machine:



                      $ psql -d mydb -c 'COPY groups TO STDOUT WITH CSV HEADER' > groups.csv


                      Or inside docker on the local machine?:



                      docker exec -tu postgres postgres psql -d mydb -c 'COPY groups TO STDOUT WITH CSV HEADER' > groups.csv


                      Or on a kubernetes cluster, in docker, over HTTPS??:



                      kubectl exec -t postgres-2592991581-ws2td 'psql -d mydb -c "COPY groups TO STDOUT WITH CSV HEADER"' > groups.csv


                      So versatile, much commas!



                      Do you even?



                      Yes I did, here are my notes:



                      The COPYses



                      Using \copy effectively executes file operations on whatever system the psql command is running on, as the user who is executing it. If you connect to a remote server, it's simple to copy data files on the system executing psql to/from the remote server.



                      COPY executes file operations on the server as the backend process user account (default postgres), file paths and permissions are checked and applied accordingly. If using TO STDOUT then file permissions checks are bypassed.



                      Both of these options require subsequent file movement if psql is not executing on the system where you want the resultant CSV to ultimately reside. This is the most likely case, in my experience, when you mostly work with remote servers.



                      It is more complex to configure something like a TCP/IP tunnel over ssh to a remote system for simple CSV output, but for other output formats (binary) it may be better to \copy over a tunneled connection, executing a local psql. In a similar vein, for large imports, moving the source file to the server and using COPY is probably the highest-performance option.



                      PSQL Parameters



                      With psql parameters you can format the output like CSV but there are downsides like having to remember to disable the pager and not getting headers:



                      $ psql -P pager=off -d mydb -t -A -F',' -c 'select * from groups;'
                      2,Technician,Test 2,,,t,,0,,
                      3,Truck,1,2017-10-02,,t,,0,,
                      4,Truck,2,2017-10-02,,t,,0,,


                      Other Tools



                      No, I just want to get CSV out of my server without compiling and/or installing a tool.
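On the consuming side, the COPY ... TO STDOUT WITH CSV HEADER stream is straightforward to parse programmatically, since the first record is the header. A minimal Python sketch over a hard-coded sample (the rows are hypothetical, mimicking the groups output above; in practice the text would come from the psql subprocess's stdout):

```python
import csv
import io

# Sample text in the shape COPY groups TO STDOUT WITH CSV HEADER emits.
copy_output = 'id,name\n1,Technician\n2,"Truck, heavy"\n'

# DictReader consumes the header row and maps each record to a dict,
# decoding the quoted embedded comma correctly.
records = list(csv.DictReader(io.StringIO(copy_output)))
print(records)
```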





























                      • Where do the results get saved to ? My query runs but the file doesn't show up anywhere on my computer. This is what I'm doing : COPY (select a,b from c where d = '1') TO STDOUT WITH CSVHEADER > abcd.csv
                        – kRazzy R
                        Apr 25 '18 at 17:00












                      • @kRazzyR The output goes to stdout of the psql command, so ultimately whatever you do with stdout is where the data goes. In my examples I use '> file.csv' to redirect to a file. You want to make sure that is outside the command being sent to the server through the psql -c parameter. See the 'local machine' example.
                        – joshperry
                        Apr 26 '18 at 2:02


















                      edited Oct 30 '18 at 20:29
                      answered Apr 24 '18 at 1:17
                      joshperry
                      11














                      In pgAdmin III there is an option to export to file from the query window. In the main menu it's Query -> Execute to file or there's a button that does the same thing (it's a green triangle with a blue floppy disk as opposed to the plain green triangle which just runs the query). If you're not running the query from the query window then I'd do what IMSoP suggested and use the copy command.






                      • IMSoP's answer didn't work for me as I needed to be a super admin. This worked a treat. Thanks!
                        – Mike
                        Jan 31 '12 at 22:08


















                      answered Nov 4 '09 at 16:58









                      Amanda Nyren














                      10














                      I'm working on AWS Redshift, which does not support the COPY TO feature.



                      My BI tool supports tab-delimited CSVs though, so I used the following:



                       psql -h dblocation -p port -U user -d dbname -F $'\t' --no-align -c "SELECT * FROM TABLE" > outfile.csv
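Note that `--no-align` output is not quoted, so a field that itself contains a tab or newline will break the format. If that is a concern, a small post-processing step can re-emit the rows as properly quoted CSV. A minimal Python sketch (the sample data is illustrative, not from a real database):

```python
import csv
import io

# Illustrative tab-separated text, as `psql -F $'\t' --no-align -t` might emit
tsv = "1\tAda, Lovelace\n2\tGrace Hopper\n"

reader = csv.reader(io.StringIO(tsv), delimiter="\t")
out = io.StringIO()
writer = csv.writer(out)  # comma-separated; quotes fields that need it

for row in reader:
    writer.writerow(row)

print(out.getvalue())
```

The `csv` module applies minimal quoting, so the field containing a comma comes out as `"Ada, Lovelace"` while plain fields stay unquoted.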





                          edited Apr 14 '14 at 23:15

























                          answered Mar 27 '14 at 0:16









                          calcsam



                              6














                              I've written a little tool called psql2csv that encapsulates the COPY query TO STDOUT pattern, resulting in proper CSV. Its interface is similar to psql.



                              psql2csv [OPTIONS] < QUERY
                              psql2csv [OPTIONS] QUERY


                              The query is assumed to be the contents of STDIN, if present, or the last argument. All other arguments are forwarded to psql except for these:



                              -h, --help           show help, then exit
                              --encoding=ENCODING use a different encoding than UTF8 (Excel likes LATIN1)
                              --no-header do not output a header





                              • Works great. Thank you.
                                – AlexM
                                Nov 3 '17 at 6:52
















                              answered Sep 18 '15 at 16:42









                              fphilipe








                              5














                              If you have a longer query and you like to use psql, then put your query in a file and use the following command:



                              psql -d my_db_name -t -A -F";" -f input-file.sql -o output-file.csv





                              • FWIW, I had to use -F"," instead of -F";" to generate a CSV file that would open correctly in MS Excel
                                – CFL_Jeff
                                May 31 '18 at 19:44
















                              answered Sep 18 '14 at 19:52









                              Andres Kull








                              5














                              I tried several things, but few of them gave me the desired CSV with header details.

                              Here is what worked for me.

                              psql -d dbname -U username \
                              -c "COPY ( SELECT * FROM TABLE ) TO STDOUT WITH CSV HEADER" > OUTPUT_CSV_FILE.csv





                                  edited Nov 12 '18 at 0:40









                                  Synesso

                                  answered Apr 2 '18 at 8:14









                                  pyAddict

                                      3














                                      New version - psql 12 - will support --csv.




                                      psql - devel



                                      --csv



                                      Switches to CSV (Comma-Separated Values) output mode. This is equivalent to \pset format csv.





                                      csv_fieldsep



                                      Specifies the field separator to be used in CSV output format. If the separator character appears in a field's value, that field is output within double quotes, following standard CSV rules. The default is a comma.




                                      Usage:



                                      psql -c "SELECT * FROM pg_catalog.pg_tables" --csv  postgres

                                      psql -c "SELECT * FROM pg_catalog.pg_tables" --csv -P csv_fieldsep='^' postgres

                                      psql -c "SELECT * FROM pg_catalog.pg_tables" --csv postgres > output.csv
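The quoting rule described above (a field containing the separator is output within double quotes) can be illustrated with Python's csv module, which follows the same convention; the '^' separator mirrors the csv_fieldsep example, and the sample row is illustrative:

```python
import csv
import io

out = io.StringIO()
writer = csv.writer(out, delimiter="^")
# the first field contains the separator, so it is emitted double-quoted
writer.writerow(["a^b", "plain"])

print(out.getvalue().strip())  # prints "a^b"^plain
```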





                                          answered Dec 8 '18 at 12:47









                                          Lukasz Szozda

                                              1














                                              JackDB, a database client in your web browser, makes this really easy. Especially if you're on Heroku.



                                              It lets you connect to remote databases and run SQL queries on them.



                                              [Screenshot: JackDB Heroku OAuth connect (http://static.jackdb.com/assets/img/blog/jackdb-heroku-oauth-connect.gif)]





                                              Once your DB is connected, you can run a query and export to CSV or TXT (see bottom right).





                                              [Screenshot: jackdb-export]



                                              Note: I'm in no way affiliated with JackDB. I currently use their free services and think it's a great product.






                                                  edited May 23 '17 at 12:18









                                                  Community

                                                  answered Apr 15 '14 at 14:50









                                                  Dennis

                                                      0














                                                      To download a CSV file with column names as a HEADER, use this command:



                                                      Copy (Select * From tableName) To '/tmp/fileName.csv' With CSV HEADER;





                                                          answered Nov 30 '18 at 8:25









                                                          murli

                                                              -3














                                                              import json

                                                              # `conn` is assumed to be an existing DB-API connection (e.g. from psycopg2)
                                                              cursor = conn.cursor()
                                                              qry = """ SELECT details FROM test_csvfile """
                                                              cursor.execute(qry)
                                                              rows = cursor.fetchall()

                                                              value = json.dumps(rows)

                                                              with open("/home/asha/Desktop/Income_output.json", "w+") as f:
                                                                  f.write(value)
                                                              print('Saved to File Successfully')
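As the comments below note, this writes JSON, not CSV. A minimal sketch of emitting the fetched rows as CSV instead, using Python's csv module (the rows and header names are illustrative stand-ins for cursor.fetchall() results):

```python
import csv
import io

# Illustrative rows, standing in for cursor.fetchall()
rows = [(1, "Ada"), (2, "Grace, Hopper")]

out = io.StringIO()  # in a real script: open("Income_output.csv", "w", newline="")
writer = csv.writer(out)
writer.writerow(["id", "details"])  # header row
writer.writerows(rows)

print(out.getvalue())
```

Fields containing commas are quoted automatically, which is what distinguishes proper CSV from naive string joining.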





                                                              share|improve this answer

















                                                              • 3




                                                                Please explain what you did when editing the answer; avoid code-only answers
                                                                – GGO
                                                                Feb 27 '18 at 12:09






                                                              • 3




                                                                Thank you for this code snippet, which might provide some limited short-term help. A proper explanation would greatly improve its long-term value by showing why this is a good solution to the problem, and would make it more useful to future readers with other, similar questions. Please edit your answer to add some explanation, including the assumptions you've made.
                                                                – Toby Speight
                                                                Feb 27 '18 at 12:48






                                                              • 2




                                                                This will produce a json file, not a csv file.
                                                                – nvoigt
                                                                Feb 27 '18 at 13:23
















                                                              answered Feb 27 '18 at 10:56









                                                              user9279273


















