{"id":15098,"date":"2020-11-11T19:44:46","date_gmt":"2020-11-11T18:44:46","guid":{"rendered":"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/"},"modified":"2020-11-11T19:44:46","modified_gmt":"2020-11-11T18:44:46","slug":"loading-data-from-s3-to-aws-rds-for-postgresql","status":"publish","type":"post","link":"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/","title":{"rendered":"Loading data from S3 to AWS RDS for PostgreSQL"},"content":{"rendered":"<p><a href=\"https:\/\/aws.amazon.com\/rds\/postgresql\/\" target=\"_blank\" rel=\"noopener noreferrer\">AWS RDS for PostgreSQL<\/a> comes with <a href=\"https:\/\/www.postgresql.org\/docs\/current\/contrib.html\" target=\"_blank\" rel=\"noopener noreferrer\">an extension<\/a> that allows you to fetch data from <a href=\"https:\/\/aws.amazon.com\/s3\/\" target=\"_blank\" rel=\"noopener noreferrer\">AWS S3<\/a> and to write back data to <a href=\"https:\/\/aws.amazon.com\/s3\/\" target=\"_blank\" rel=\"noopener noreferrer\">AWS S3<\/a>. The use case for this is obvious: Either you use other AWS services that write data to S3 and you want to further process that data in PostgreSQL, or you want other AWS services to consume data from PostgreSQL by providing that data in S3. 
Let&#8217;s have a look at how that works.<\/p>\n<p><!--more--><\/p>\n<p>The extension AWS is providing for working with S3 from inside PostgreSQL is called &#8220;aws_s3&#8221;:<\/p>\n<pre class=\"brush: sql; gutter: true; first-line: 1; highlight: [5]\">\npostgres=&gt; select * from pg_available_extensions where name like '%aws%';\n    name     | default_version | installed_version |                   comment                   \n-------------+-----------------+-------------------+---------------------------------------------\n aws_commons | 1.0             |                   | Common data types across AWS services\n aws_s3      | 1.0             |                   | AWS S3 extension for importing data from S3\n(2 rows)\n<\/pre>\n<p>If you try to install the extension you&#8217;ll notice that there is a dependency on the &#8220;aws_commons&#8221; extension:<\/p>\n<pre class=\"brush: sql; gutter: true; first-line: 1\">\npostgres=&gt; create extension aws_s3;\nERROR:  required extension \"aws_commons\" is not installed\nHINT:  Use CREATE EXTENSION ... 
CASCADE to install required extensions too.\n<\/pre>\n<p>You can install both extensions in one step using the &#8220;CASCADE&#8221; option:<\/p>\n<pre class=\"brush: sql; gutter: true; first-line: 1\">\npostgres=&gt; create extension aws_s3 cascade;\nNOTICE:  installing required extension \"aws_commons\"\nCREATE EXTENSION\n<\/pre>\n<p>These extensions provide a couple of helper functions (aws_commons) and the function to import a file from S3 (aws_s3):<\/p>\n<pre class=\"brush: sql; gutter: true; first-line: 1\">\npostgres=&gt; \\dx+ aws_commons\n             Objects in extension \"aws_commons\"\n                     Object description                      \n-------------------------------------------------------------\n function aws_commons.create_aws_credentials(text,text,text)\n function aws_commons.create_s3_uri(text,text,text)\n schema aws_commons\n type aws_commons._aws_credentials_1\n type aws_commons._s3_uri_1\n(5 rows)\n\npostgres=&gt; \\dx+ aws_s3\n                                       Objects in extension \"aws_s3\"\n                                            Object description                                             \n-----------------------------------------------------------------------------------------------------------\n function aws_s3.table_import_from_s3(text,text,text,aws_commons._s3_uri_1,aws_commons._aws_credentials_1)\n function aws_s3.table_import_from_s3(text,text,text,text,text,text,text,text,text)\n schema aws_s3\n(3 rows)\n<\/pre>\n<p>With the extensions in place we need a file to import, so let&#8217;s create one (exactly the same file as in the <a href=\"https:\/\/www.dbi-services.com\/blog\/getting-started-with-exasol-distribution-keys\/\" target=\"_blank\" rel=\"noopener noreferrer\">previous post<\/a>, but with a bit fewer rows):<\/p>\n<pre class=\"brush: bash; gutter: true; first-line: 1\">\ndwe@dwe:~\/Downloads$ cat gen_data.sh \n#!\/bin\/bash\n \nFILE=\"\/home\/dwe\/Downloads\/sample.csv\"\nrm -rf ${FILE}\n \nfor i in {1..1000000}; do\n 
   echo \"${i},firstname${i},lastname${i},xxx${i}@xxx.com,street${i},country${i},description${i}\" &gt;&gt; ${FILE}\ndone\n\ndwe@dwe:~\/Downloads$ chmod +x gen_data.sh \ndwe@dwe:~\/Downloads$ .\/gen_data.sh \ndwe@dwe:~\/Downloads$ head -5 sample.csv \n1,firstname1,lastname1,xxx1@xxx.com,street1,country1,description1\n2,firstname2,lastname2,xxx2@xxx.com,street2,country2,description2\n3,firstname3,lastname3,xxx3@xxx.com,street3,country3,description3\n4,firstname4,lastname4,xxx4@xxx.com,street4,country4,description4\n5,firstname5,lastname5,xxx5@xxx.com,street5,country5,description5\ndwe@dwe:~\/Downloads$ ls -lha sample.csv \n-rw-rw-r-- 1 dwe dwe 96M Nov 10 11:11 sample.csv\n<\/pre>\n<p>We&#8217;ll be using a new bucket for this demo, so let&#8217;s create one and then upload the file we just generated:<\/p>\n<pre class=\"brush: bash; gutter: true; first-line: 1\">\ndwe@dwe:~\/Downloads$ aws s3 mb s3:\/\/s3-rds-demo --region eu-central-1\nmake_bucket: s3-rds-demo\ndwe@dwe:~\/Downloads$ aws s3 cp sample.csv s3:\/\/s3-rds-demo\/\nupload: .\/sample.csv to s3:\/\/s3-rds-demo\/sample.csv         \n<\/pre>\n<p>Before we can do anything against S3 from RDS for PostgreSQL we need to set up the required permissions. You can use security credentials for this, but it is recommended to use IAM roles and policies. 
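<\/p>\n<p>For completeness, this is roughly what the credentials-based route would look like: aws_commons ships a helper that wraps an access key, a secret key and an optional session token into a credentials type, which could then be passed as the fifth argument to aws_s3.table_import_from_s3. The keys below are placeholders and this is only a sketch, we will not use it here:<\/p>\n<pre class=\"brush: sql; gutter: true; first-line: 1\">\npostgres=&gt; SELECT aws_commons.create_aws_credentials ( 'AKIA_ACCESS_KEY_PLACEHOLDER'\npostgres(&gt;                                           , 'SECRET_KEY_PLACEHOLDER'\npostgres(&gt;                                           , ''\npostgres(&gt;                                           ) AS creds \\gset\n<\/pre>\n<p>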
The first step is to create a <a href=\"https:\/\/docs.aws.amazon.com\/IAM\/latest\/UserGuide\/access_policies.html\" target=\"_blank\" rel=\"noopener noreferrer\">policy<\/a> that allows listing the bucket as well as reading and writing objects (write access is required for exporting data to S3 later on):<\/p>\n<pre class=\"brush: bash; gutter: true; first-line: 1\">\ndwe@dwe:~$ aws iam create-policy \\\n&gt;    --policy-name rds-s3-policy \\\n&gt;    --policy-document '{\n&gt;      \"Version\": \"2012-10-17\",\n&gt;      \"Statement\": [\n&gt;        {\n&gt;          \"Sid\": \"s3import\",\n&gt;          \"Action\": [\n&gt;            \"s3:GetObject\",\n&gt;            \"s3:ListBucket\",\n&gt;            \"s3:PutObject\"\n&gt;          ],\n&gt;          \"Effect\": \"Allow\",\n&gt;          \"Resource\": [\n&gt;            \"arn:aws:s3:::s3-rds-demo\", \n&gt;            \"arn:aws:s3:::s3-rds-demo\/*\"\n&gt;          ] \n&gt;        }\n&gt;      ] \n&gt;    }' \n{\n    \"Policy\": {\n        \"PolicyName\": \"rds-s3-policy\",\n        \"PolicyId\": \"ANPA2U57KX3NFH4HU4COG\",\n        \"Arn\": \"arn:aws:iam::xxxxxxxx:policy\/rds-s3-policy\",\n        \"Path\": \"\/\",\n        \"DefaultVersionId\": \"v1\",\n        \"AttachmentCount\": 0,\n        \"PermissionsBoundaryUsageCount\": 0,\n        \"IsAttachable\": true,\n        \"CreateDate\": \"2020-11-10T12:04:34+00:00\",\n        \"UpdateDate\": \"2020-11-10T12:04:34+00:00\"\n    }\n}\n<\/pre>\n<p>Once the policy is in place we create an <a href=\"https:\/\/docs.aws.amazon.com\/IAM\/latest\/UserGuide\/id_roles.html\" target=\"_blank\" rel=\"noopener noreferrer\">IAM role<\/a> to which we attach the policy we just created:<\/p>\n<pre class=\"brush: bash; gutter: true; first-line: 1\">\ndwe@dwe:~$ aws iam create-role \\\n&gt;    --role-name rds-s3-role \\\n&gt;    --assume-role-policy-document '{\n&gt;      \"Version\": \"2012-10-17\",\n&gt;      \"Statement\": [\n&gt;        {\n&gt;          \"Effect\": \"Allow\",\n&gt;          \"Principal\": {\n&gt;   
          \"Service\": \"rds.amazonaws.com\"\n&gt;           },\n&gt;          \"Action\": \"sts:AssumeRole\"\n&gt;        }\n&gt;      ] \n&gt;    }'\n{\n    \"Role\": {\n        \"Path\": \"\/\",\n        \"RoleName\": \"rds-s3-role\",\n        \"RoleId\": \"AROA2U57KX3NP2XWVCELI\",\n        \"Arn\": \"arn:aws:iam::xxxxxxxxxx:role\/rds-s3-role\",\n        \"CreateDate\": \"2020-11-10T12:07:20+00:00\",\n        \"AssumeRolePolicyDocument\": {\n            \"Version\": \"2012-10-17\",\n            \"Statement\": [\n                {\n                    \"Effect\": \"Allow\",\n                    \"Principal\": {\n                        \"Service\": \"rds.amazonaws.com\"\n                    },\n                    \"Action\": \"sts:AssumeRole\"\n                }\n            ]\n        }\n    }\n}\n<\/pre>\n<p>Attaching the policy to the role (you will need the <a href=\"https:\/\/docs.aws.amazon.com\/general\/latest\/gr\/aws-arns-and-namespaces.html\" target=\"_blank\" rel=\"noopener noreferrer\">ARN<\/a> of the policy from above):<\/p>\n<pre class=\"brush: bash; gutter: true; first-line: 1\">\ndwe@dwe:~$ aws iam attach-role-policy \\\n&gt;    --policy-arn arn:aws:iam::xxxxxxxxxx:policy\/rds-s3-policy \\\n&gt;    --role-name rds-s3-role\n<\/pre>\n<p>Finally, you need to attach the IAM role to the RDS instance by providing the ARN of the role and the identifier of your RDS instance:<\/p>\n<pre class=\"brush: bash; gutter: true; first-line: 1\">\naws rds add-role-to-db-instance \\\n   --db-instance-identifier dwe-postgres-helvetia \\\n   --feature-name s3Import \\\n   --role-arn arn:aws:iam::xxxxxxxx:role\/rds-s3-role \\\n   --region eu-central-1\n<\/pre>\n<p>Your RDS instance needs to be running to do that, otherwise you&#8217;ll get this:<\/p>\n<pre class=\"brush: bash; gutter: true; first-line: 1\">\nAn error occurred (InvalidDBInstanceState) when calling the AddRoleToDBInstance operation: The status for the dwe-postgres DB instance is stopped. 
The DB instance is not available for s3Import feature.\n<\/pre>\n<p>With the IAM role attached to the RDS instance we can load the CSV file, but first the S3 URI needs to be defined (we do not want to use access keys and credentials):<\/p>\n<pre class=\"brush: sql; gutter: true; first-line: 1\">\npostgres=&gt; SELECT aws_commons.create_s3_uri('s3-rds-demo'\npostgres(&gt;                                 ,'sample.csv'\npostgres(&gt;                                 ,'eu-central-1'\npostgres(&gt;                                 ) AS s3_uri \\gset\npostgres=&gt; select :'s3_uri';\n               ?column?                \n---------------------------------------\n (s3-rds-demo,sample.csv,eu-central-1)\n(1 row)\n<\/pre>\n<p>Now we are ready to load the file:<\/p>\n<pre class=\"brush: sql; gutter: true; first-line: 1\">\npostgres=&gt; create table sample ( id int primary key\npostgres(&gt;                              , firstname varchar(20)\npostgres(&gt;                              , lastname varchar(20)\npostgres(&gt;                              , email varchar(20)\npostgres(&gt;                              , street varchar(20)\npostgres(&gt;                              , country varchar(20)\npostgres(&gt;                              , description varchar(20)\npostgres(&gt;                              );\nCREATE TABLE\npostgres=&gt; SELECT aws_s3.table_import_from_s3 ( 'sample'\n                                   , ''\n                                   , '(format csv)'\n                                   , :'s3_uri'\n                                   );\n                                 table_import_from_s3                                 \n--------------------------------------------------------------------------------------\n 1000000 rows imported into relation \"sample\" from file sample.csv of 100222272 bytes\n(1 row)\npostgres=&gt; select * from sample limit 5;\n id |  firstname  |  lastname  |     email     |  street  |  country  |  description  
\n----+-------------+------------+---------------+----------+-----------+---------------\n 77 | firstname77 | lastname77 | xxx77@xxx.com | street77 | country77 | description77\n 78 | firstname78 | lastname78 | xxx78@xxx.com | street78 | country78 | description78\n 79 | firstname79 | lastname79 | xxx79@xxx.com | street79 | country79 | description79\n  1 | firstname1  | lastname1  | xxx1@xxx.com  | street1  | country1  | description1\n  2 | firstname2  | lastname2  | xxx2@xxx.com  | street2  | country2  | description2\n(5 rows)\n<\/pre>\n<p>And we&#8217;re done. The follow-up post will show the opposite: writing back to S3 from RDS for PostgreSQL.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>AWS RDS for PostgreSQL comes with an extension that allows you to fetch data from AWS S3 and to write back data to AWS S3. The use case for this is obvious: Either you use other AWS services that write data to S3 and you want to further process that data in PostgreSQL, or you [&hellip;]<\/p>\n","protected":false},"author":29,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1865,229],"tags":[133,77,1869,1885],"type_dbi":[],"class_list":["post-15098","post","type-post","status-publish","format-standard","hentry","category-aws","category-database-administration-monitoring","tag-aws","tag-postgresql","tag-rds","tag-s3"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v27.2 (Yoast SEO v27.2) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>Loading data from S3 to AWS RDS for PostgreSQL - dbi Blog<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" 
\/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Loading data from S3 to AWS RDS for PostgreSQL\" \/>\n<meta property=\"og:description\" content=\"AWS RDS for PostgreSQL comes with an extension that allows you to fetch data from AWS S3 and to write back data to AWS S3. The use case for this is obvious: Either you use other AWS services that write data to S3 and you want to further process that data in PostgreSQL, or you [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/\" \/>\n<meta property=\"og:site_name\" content=\"dbi Blog\" \/>\n<meta property=\"article:published_time\" content=\"2020-11-11T18:44:46+00:00\" \/>\n<meta name=\"author\" content=\"Daniel Westermann\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@westermanndanie\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Daniel Westermann\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"6 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/\"},\"author\":{\"name\":\"Daniel Westermann\",\"@id\":\"https:\/\/www.dbi-services.com\/blog\/#\/schema\/person\/8d08e9bd996a89bd75c0286cbabf3c66\"},\"headline\":\"Loading data from S3 to AWS RDS for PostgreSQL\",\"datePublished\":\"2020-11-11T18:44:46+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/\"},\"wordCount\":407,\"commentCount\":1,\"keywords\":[\"AWS\",\"PostgreSQL\",\"RDS\",\"S3\"],\"articleSection\":[\"AWS\",\"Database Administration &amp; Monitoring\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/\",\"url\":\"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/\",\"name\":\"Loading data from S3 to AWS RDS for PostgreSQL - dbi 
Blog\",\"isPartOf\":{\"@id\":\"https:\/\/www.dbi-services.com\/blog\/#website\"},\"datePublished\":\"2020-11-11T18:44:46+00:00\",\"author\":{\"@id\":\"https:\/\/www.dbi-services.com\/blog\/#\/schema\/person\/8d08e9bd996a89bd75c0286cbabf3c66\"},\"breadcrumb\":{\"@id\":\"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Accueil\",\"item\":\"https:\/\/www.dbi-services.com\/blog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Loading data from S3 to AWS RDS for PostgreSQL\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.dbi-services.com\/blog\/#website\",\"url\":\"https:\/\/www.dbi-services.com\/blog\/\",\"name\":\"dbi Blog\",\"description\":\"\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.dbi-services.com\/blog\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.dbi-services.com\/blog\/#\/schema\/person\/8d08e9bd996a89bd75c0286cbabf3c66\",\"name\":\"Daniel 
Westermann\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/secure.gravatar.com\/avatar\/31350ceeecb1dd8986339a29bf040d4cd3cd087d410deccd8f55234466d6c317?s=96&d=mm&r=g\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/31350ceeecb1dd8986339a29bf040d4cd3cd087d410deccd8f55234466d6c317?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/31350ceeecb1dd8986339a29bf040d4cd3cd087d410deccd8f55234466d6c317?s=96&d=mm&r=g\",\"caption\":\"Daniel Westermann\"},\"description\":\"Daniel Westermann is Principal Consultant and Technology Leader Open Infrastructure at dbi services. He has more than 15 years of experience in management, engineering and optimization of databases and infrastructures, especially on Oracle and PostgreSQL. Since the beginning of his career, he has specialized in Oracle Technologies and is Oracle Certified Professional 12c and Oracle Certified Expert RAC\/GridInfra. Over time, Daniel has become increasingly interested in open source technologies, becoming \u201cTechnology Leader Open Infrastructure\u201d and PostgreSQL expert. \u00a0Based on community or EnterpriseDB tools, he develops and installs complex high available solutions with PostgreSQL. He is also a certified PostgreSQL Plus 9.0 Professional and a Postgres Advanced Server 9.4 Professional. He is a regular speaker at PostgreSQL conferences in Switzerland and Europe. Today Daniel is also supporting our customers on AWS services such as AWS RDS, database migrations into the cloud, EC2 and automated infrastructure management with AWS SSM (System Manager). He is a certified AWS Solutions Architect Professional. Prior to dbi services, Daniel was Management System Engineer at LC SYSTEMS-Engineering AG in Basel. Before that, he worked as Oracle Developper &amp;\u00a0Project Manager at Delta Energy Solutions AG in Basel (today Powel AG). Daniel holds a diploma in Business Informatics (DHBW, Germany). 
His branch-related experience mainly covers the pharma industry, the financial sector, energy, lottery and telecommunications.\",\"sameAs\":[\"https:\/\/x.com\/westermanndanie\"],\"url\":\"https:\/\/www.dbi-services.com\/blog\/author\/daniel-westermann\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"Loading data from S3 to AWS RDS for PostgreSQL - dbi Blog","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/","og_locale":"en_US","og_type":"article","og_title":"Loading data from S3 to AWS RDS for PostgreSQL","og_description":"AWS RDS for PostgreSQL comes with an extension that allows you to fetch data from AWS S3 and to write back data to AWS S3. The use case for this is obvious: Either you use other AWS services that write data to S3 and you want to further process that data in PostgreSQL, or you [&hellip;]","og_url":"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/","og_site_name":"dbi Blog","article_published_time":"2020-11-11T18:44:46+00:00","author":"Daniel Westermann","twitter_card":"summary_large_image","twitter_creator":"@westermanndanie","twitter_misc":{"Written by":"Daniel Westermann","Est. 
reading time":"6 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/#article","isPartOf":{"@id":"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/"},"author":{"name":"Daniel Westermann","@id":"https:\/\/www.dbi-services.com\/blog\/#\/schema\/person\/8d08e9bd996a89bd75c0286cbabf3c66"},"headline":"Loading data from S3 to AWS RDS for PostgreSQL","datePublished":"2020-11-11T18:44:46+00:00","mainEntityOfPage":{"@id":"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/"},"wordCount":407,"commentCount":1,"keywords":["AWS","PostgreSQL","RDS","S3"],"articleSection":["AWS","Database Administration &amp; Monitoring"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/","url":"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/","name":"Loading data from S3 to AWS RDS for PostgreSQL - dbi 
Blog","isPartOf":{"@id":"https:\/\/www.dbi-services.com\/blog\/#website"},"datePublished":"2020-11-11T18:44:46+00:00","author":{"@id":"https:\/\/www.dbi-services.com\/blog\/#\/schema\/person\/8d08e9bd996a89bd75c0286cbabf3c66"},"breadcrumb":{"@id":"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/www.dbi-services.com\/blog\/loading-data-from-s3-to-aws-rds-for-postgresql\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Accueil","item":"https:\/\/www.dbi-services.com\/blog\/"},{"@type":"ListItem","position":2,"name":"Loading data from S3 to AWS RDS for PostgreSQL"}]},{"@type":"WebSite","@id":"https:\/\/www.dbi-services.com\/blog\/#website","url":"https:\/\/www.dbi-services.com\/blog\/","name":"dbi Blog","description":"","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.dbi-services.com\/blog\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/www.dbi-services.com\/blog\/#\/schema\/person\/8d08e9bd996a89bd75c0286cbabf3c66","name":"Daniel Westermann","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/31350ceeecb1dd8986339a29bf040d4cd3cd087d410deccd8f55234466d6c317?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/31350ceeecb1dd8986339a29bf040d4cd3cd087d410deccd8f55234466d6c317?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/31350ceeecb1dd8986339a29bf040d4cd3cd087d410deccd8f55234466d6c317?s=96&d=mm&r=g","caption":"Daniel Westermann"},"description":"Daniel Westermann is Principal Consultant and Technology Leader Open 
Infrastructure at dbi services. He has more than 15 years of experience in management, engineering and optimization of databases and infrastructures, especially on Oracle and PostgreSQL. Since the beginning of his career, he has specialized in Oracle Technologies and is Oracle Certified Professional 12c and Oracle Certified Expert RAC\/GridInfra. Over time, Daniel has become increasingly interested in open source technologies, becoming \u201cTechnology Leader Open Infrastructure\u201d and PostgreSQL expert. \u00a0Based on community or EnterpriseDB tools, he develops and installs complex high available solutions with PostgreSQL. He is also a certified PostgreSQL Plus 9.0 Professional and a Postgres Advanced Server 9.4 Professional. He is a regular speaker at PostgreSQL conferences in Switzerland and Europe. Today Daniel is also supporting our customers on AWS services such as AWS RDS, database migrations into the cloud, EC2 and automated infrastructure management with AWS SSM (System Manager). He is a certified AWS Solutions Architect Professional. Prior to dbi services, Daniel was Management System Engineer at LC SYSTEMS-Engineering AG in Basel. Before that, he worked as Oracle Developper &amp;\u00a0Project Manager at Delta Energy Solutions AG in Basel (today Powel AG). Daniel holds a diploma in Business Informatics (DHBW, Germany). 
His branch-related experience mainly covers the pharma industry, the financial sector, energy, lottery and telecommunications.","sameAs":["https:\/\/x.com\/westermanndanie"],"url":"https:\/\/www.dbi-services.com\/blog\/author\/daniel-westermann\/"}]}},"_links":{"self":[{"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/posts\/15098","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/users\/29"}],"replies":[{"embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/comments?post=15098"}],"version-history":[{"count":0,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/posts\/15098\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/media?parent=15098"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/categories?post=15098"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/tags?post=15098"},{"taxonomy":"type","embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/type_dbi?post=15098"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}