{"id":38831,"date":"2025-05-21T13:34:32","date_gmt":"2025-05-21T11:34:32","guid":{"rendered":"https:\/\/www.dbi-services.com\/blog\/?p=38831"},"modified":"2025-05-21T13:34:35","modified_gmt":"2025-05-21T11:34:35","slug":"using-dlt-to-get-data-from-db2-to-postgresql","status":"publish","type":"post","link":"https:\/\/www.dbi-services.com\/blog\/using-dlt-to-get-data-from-db2-to-postgresql\/","title":{"rendered":"Using dlt to get data from Db2 to PostgreSQL"},"content":{"rendered":"\n<p>For a recent project at one of our customers we needed to get data from a Db2 database into PostgreSQL. The first solution we thought of was the <a href=\"https:\/\/github.com\/wolfgangbrandl\/db2_fdw\" target=\"_blank\" rel=\"noreferrer noopener\">foreign data wrapper for Db2<\/a>. This is usually easy to setup and configure and all you need are the client libraries (for Db2 in this case). But it turned out that db2_fdw is so old that it cannot be used against a recent version of PostgreSQL (we tested 15,16,17). We even fixed some of the code but it became clear very fast, that this is not the solution to go with. There is also <a href=\"https:\/\/github.com\/dalibo\/db2topg\" target=\"_blank\" rel=\"noreferrer noopener\">db2topg<\/a> but this is not as advanced as it&#8217;s brother <a href=\"https:\/\/ora2pg.darold.net\" target=\"_blank\" rel=\"noreferrer noopener\">ora2pg<\/a> and we did not even consider trying that. Another tool you can use for such tasks is <a href=\"https:\/\/dlthub.com\/product\/dlt\" target=\"_blank\" rel=\"noreferrer noopener\">dtl<\/a> (data load tool), and it turned out this is surprisingly easy to install, configure and use. 
You are not limited to Db2 as a source; many more options are available.<\/p>\n\n\n\n<p>As the customer is using Red Hat 8 for the PostgreSQL nodes, we start with a fresh Red Hat 8 installation as well:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: bash; highlight: [1]; title: ; notranslate\" title=\"\">\npostgres@rhel8:\/u02\/pgdata\/17\/ &#x5B;PG1] cat \/etc\/os-release\nNAME=&quot;Red Hat Enterprise Linux&quot;\nVERSION=&quot;8.10 (Ootpa)&quot;\nID=&quot;rhel&quot;\nID_LIKE=&quot;fedora&quot;\nVERSION_ID=&quot;8.10&quot;\nPLATFORM_ID=&quot;platform:el8&quot;\nPRETTY_NAME=&quot;Red Hat Enterprise Linux 8.10 (Ootpa)&quot;\nANSI_COLOR=&quot;0;31&quot;\nCPE_NAME=&quot;cpe:\/o:redhat:enterprise_linux:8::baseos&quot;\nHOME_URL=&quot;https:\/\/www.redhat.com\/&quot;\nDOCUMENTATION_URL=&quot;https:\/\/access.redhat.com\/documentation\/en-us\/red_hat_enterprise_linux\/8&quot;\nBUG_REPORT_URL=&quot;https:\/\/issues.redhat.com\/&quot;\n\nREDHAT_BUGZILLA_PRODUCT=&quot;Red Hat Enterprise Linux 8&quot;\nREDHAT_BUGZILLA_PRODUCT_VERSION=8.10\nREDHAT_SUPPORT_PRODUCT=&quot;Red Hat Enterprise Linux&quot;\nREDHAT_SUPPORT_PRODUCT_VERSION=&quot;8.10&quot;\n<\/pre><\/div>\n\n\n<p>PostgreSQL 17 is already up and running:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: bash; highlight: [1]; title: ; notranslate\" title=\"\">\npostgres@rhel8:\/u02\/pgdata\/17\/ &#x5B;PG1] psql -c &quot;select version()&quot;\n                                                          version\n----------------------------------------------------------------------------------------------------------------------------\n PostgreSQL 17.5 on x86_64-pc-linux-gnu, compiled by gcc (GCC) 8.5.0 20210514 (Red Hat 8.5.0-26), 64-bit\n(1 row)\n<\/pre><\/div>\n\n\n<p>To avoid messing up the Python installation from the operating system, we&#8217;ll use a <a href=\"https:\/\/docs.python.org\/3\/library\/venv.html\" target=\"_blank\" rel=\"noreferrer 
noopener\">Python virtual environment <\/a>for dlt and install libjpeg-turbo-devel and git as those are required later on:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: bash; highlight: [1,2,3,4,5]; title: ; notranslate\" title=\"\">\npostgres@rhel8:\/home\/postgres\/ &#x5B;PG1] sudo dnf install libjpeg-turbo-devel git\npostgres@rhel8:\/u02\/pgdata\/17\/ &#x5B;PG1] sudo dnf install python3-virtualenv -y\npostgres@rhel8:\/u02\/pgdata\/17\/ &#x5B;PG1] python3.12 -m venv .local\npostgres@rhel8:\/home\/postgres\/ &#x5B;PG1] .local\/bin\/pip3 install --upgrade pip\npostgres@rhel8:\/u02\/pgdata\/17\/ &#x5B;PG1] . .local\/bin\/activate\n<\/pre><\/div>\n\n\n<p>Once we have the Python virtual environment ready and activated, the installation of dlt is just a matter of asking <a href=\"https:\/\/pypi.org\/project\/pip\/\">pip<\/a> to install it for us (for this you need access to the internet, of course):<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: bash; highlight: [1,2]; title: ; notranslate\" title=\"\">\npostgres@rhel8:\/u02\/pgdata\/17\/ &#x5B;PG1] .local\/bin\/pip3 install -U &quot;dlt&#x5B;postgres]&quot;\npostgres@rhel8:\/home\/postgres\/ &#x5B;PG1] which dlt\n~\/.local\/bin\/dlt\n<\/pre><\/div>\n\n\n<p>Having that installed we can initialize a new pipeline based on the <a href=\"https:\/\/dlthub.com\/docs\/tutorial\/sql-database\" target=\"_blank\" rel=\"noreferrer noopener\">sql_database template<\/a> and we want &#8220;postgres&#8221; as the destination:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: bash; highlight: [1,2,5,14]; title: ; notranslate\" title=\"\">\npostgres@rhel8:\/home\/postgres\/ &#x5B;PG1] mkdir db2_postgresql &amp;&amp; cd $_\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] dlt init sql_database postgres\nCreating a new pipeline with the dlt core source sql_database (Source that loads tables form any SQLAlchemy supported database, 
supports batching requests and incremental loads.)\nNOTE: Beginning with dlt 1.0.0, the source sql_database will no longer be copied from the verified sources repo but imported from dlt.sources. You can provide the --eject flag to revert to the old behavior.\nDo you want to proceed? &#x5B;Y\/n]: y\n\nYour new pipeline sql_database is ready to be customized!\n* Review and change how dlt loads your data in sql_database_pipeline.py\n* Add credentials for postgres and other secrets to .\/.dlt\/secrets.toml\n* requirements.txt was created. Install it with:\npip3 install -r requirements.txt\n* Read https:\/\/dlthub.com\/docs\/walkthroughs\/create-a-pipeline for more information\n\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] ls -l\ntotal 20\n-rw-r--r--. 1 postgres postgres    34 May 21 09:07 requirements.txt\n-rw-r--r--. 1 postgres postgres 12834 May 21 09:07 sql_database_pipeline.py\n<\/pre><\/div>\n\n\n<p>As mentioned in the output above, additional dependencies need to be installed. Since dlt reads the source via SQLAlchemy, the ibm-db-sa dialect for Db2 is required as well:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: bash; highlight: [1,3,4,13]; title: ; notranslate\" title=\"\">\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] cat requirements.txt \ndlt&#x5B;postgres,sql-database]&gt;=1.11.0\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] pip install -r requirements.txt\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] find .\n.\n.\/.dlt\n.\/.dlt\/config.toml\n.\/.dlt\/secrets.toml\n.\/.gitignore\n.\/sql_database_pipeline.py\n.\/requirements.txt\n\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] pip install ibm-db-sa\n<\/pre><\/div>\n\n\n<p>Now is the time to configure the credentials and connection parameters for the source and destination databases, and this is done in the &#8220;secrets.toml&#8221; file:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: bash; highlight: [1]; title: ; notranslate\" 
title=\"\">\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] cat .dlt\/secrets.toml \n&#x5B;sources.sql_database.credentials]\ndrivername = &quot;db2+ibm_db&quot;\ndatabase = &quot;db1&quot; \npassword = &quot;manager&quot; \nusername = &quot;db2inst1&quot; \nschema = &quot;omrun&quot;\nhost = &quot;172.22.11.93&quot;\nport = 25010 \n\n&#x5B;destination.postgres.credentials]\ndatabase = &quot;postgres&quot; \npassword = &quot;postgres&quot;\nusername = &quot;postgres&quot;\nhost = &quot;192.168.122.60&quot;\nport = 5432\nconnect_timeout = 15\n<\/pre><\/div>\n\n\n<p>When we initialized the pipeline a template called &#8220;sql_database_pipeline.py&#8221; was created, and this is what we need to adjust now. There are several samples in that template, we&#8217;ve used the load_select_tables_from_database skeleton:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: python; title: ; notranslate\" title=\"\">\n# flake8: noqa\nimport humanize\nfrom typing import Any\nimport os\n\nimport dlt\nfrom dlt.common import pendulum\nfrom dlt.sources.credentials import ConnectionStringCredentials\n\nfrom dlt.sources.sql_database import sql_database, sql_table, Table\n\nfrom sqlalchemy.sql.sqltypes import TypeEngine\nimport sqlalchemy as sa\n\n\ndef load_select_tables_from_database() -&gt; None:\n    &quot;&quot;&quot;Use the sql_database source to reflect an entire database schema and load select tables from it.\n\n    This example sources data from the public Rfam MySQL database.\n    &quot;&quot;&quot;\n    # Create a pipeline\n    pipeline = dlt.pipeline(pipeline_name=&quot;omrun&quot;, destination=&#039;postgres&#039;, dataset_name=&quot;omrun&quot;)\n\n    # This are the tables we want to load\n    source_1 = sql_database(schema=&quot;omrun&quot;).with_resources(&quot;loadcheck_a&quot;, &quot;loadcheck_b&quot;)\n\n    # Run the pipeline. 
The &quot;replace&quot; write disposition replaces all existing data for the table in the destination\n    info = pipeline.run(source_1, write_disposition=&quot;replace&quot;)\n    print(info)\n\nif __name__ == &quot;__main__&quot;:\n    # Load selected tables with different settings\n    load_select_tables_from_database()\n<\/pre><\/div>\n\n\n<p>That&#8217;s all the code that is required for this simple use case. We&#8217;ve specified the database schema (omrun) and the two tables we want to load the data from (&#8220;loadcheck_a&#8221;, &#8220;loadcheck_b&#8221;). In addition, we want the data to be replaced on the target (&#8220;merge&#8221; and &#8220;append&#8221; are also available).<\/p>\n\n\n\n<p>This is how it looks in Db2 for the first table:<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"778\" height=\"462\" src=\"https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_104329.png\" alt=\"\" class=\"wp-image-38839\" srcset=\"https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_104329.png 778w, https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_104329-300x178.png 300w, https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_104329-768x456.png 768w\" sizes=\"auto, (max-width: 778px) 100vw, 778px\" \/><\/figure>\n\n\n\n<p>Ready to run the pipeline:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: bash; highlight: [1]; title: ; notranslate\" title=\"\">\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] python sql_database_pipeline.py \nPipeline omrun load step completed in 0.73 seconds\n1 load package(s) were loaded to destination postgres and into dataset omrun\nThe postgres destination used postgresql:\/\/postgres:***@192.168.122.60:5432\/postgres location to store data\nLoad package 1747817065.3199458 is LOADED and contains no 
failed jobs\n<\/pre><\/div>\n\n\n<p>Everything seems to be OK, let&#8217;s check in PostgreSQL. The schema &#8220;omrun&#8221; was created automatically:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: bash; highlight: [1]; title: ; notranslate\" title=\"\">\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] psql -c &quot;\\dn&quot;\n          List of schemas\n     Name      |       Owner       \n---------------+-------------------\n omrun         | postgres\n omrun_staging | postgres\n public        | pg_database_owner\n(3 rows)\n<\/pre><\/div>\n\n\n<p>Looking at the tables in that schema, both tables are there and contain the data:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: bash; highlight: [1,13,19]; title: ; notranslate\" title=\"\">\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] psql -c &quot;set search_path=&#039;omrun&#039;&quot; -c &quot;\\d&quot;\nSET\n                List of relations\n Schema |        Name         | Type  |  Owner   \n--------+---------------------+-------+----------\n omrun  | _dlt_loads          | table | postgres\n omrun  | _dlt_pipeline_state | table | postgres\n omrun  | _dlt_version        | table | postgres\n omrun  | loadcheck_a         | table | postgres\n omrun  | loadcheck_b         | table | postgres\n(5 rows)\n\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] psql -c &quot;select count(*) from omrun.loadcheck_a&quot;\n count  \n--------\n 102401\n(1 row)\n\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] psql -c &quot;select * from omrun.loadcheck_a limit 5&quot;\n spalte0 | spalte1 |    _dlt_load_id    |    _dlt_id     \n---------+---------+--------------------+----------------\n       1 | test1   | 1747817065.3199458 | tmQTbuEnpjoJ8Q\n       2 | test2   | 1747817065.3199458 | Y5D4aEbyZmaDVw\n       3 | test3   | 1747817065.3199458 | RxcyPugGndIRQA\n       4 | test4   | 1747817065.3199458 | YHcJLkKML48\/8g\n  
     5 | test5   | 1747817065.3199458 | ywNZhazXRAlFnQ\n(5 rows)\n<\/pre><\/div>\n\n\n<p>Two additional columns have been added to the tables. &#8220;_dlt_load_id&#8221; and &#8220;_dlt_id&#8221; are not there in Db2, but get added automatically by dlt for internal purposes. The same is true for the &#8220;omrun_staging&#8221; schema.<\/p>\n\n\n\n<p>Inspecting the pipeline can be done with the &#8220;info&#8221; command:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: bash; highlight: [1]; title: ; notranslate\" title=\"\">\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] dlt pipeline omrun info \nFound pipeline omrun in \/home\/postgres\/.dlt\/pipelines\nSynchronized state:\n_state_version: 2\n_state_engine_version: 4\ndataset_name: omrun\nschema_names: &#x5B;&#039;sql_database&#039;]\npipeline_name: omrun\ndefault_schema_name: sql_database\ndestination_type: dlt.destinations.postgres\ndestination_name: None\n_version_hash: e\/mg52\/UONZ79Z5wrl8THEl8LeuKw+xQlA8FqYvgdaU=\n\nsources:\nAdd -v option to see sources state. Note that it could be large.\n\nLocal state:\nfirst_run: False\ninitial_cwd: \/home\/postgres\/db2_postgresql\n_last_extracted_at: 2025-05-21 07:52:50.530143+00:00\n_last_extracted_hash: e\/mg52\/UONZ79Z5wrl8THEl8LeuKw+xQlA8FqYvgdaU=\n\nResources in schema: sql_database\nloadcheck_a with 1 table(s) and 0 resource state slot(s)\nloadcheck_b with 1 table(s) and 0 resource state slot(s)\n\nWorking dir content:\nHas 6 completed load packages with following load ids:\n1747813450.4990926\n1747813500.9859562\n1747813559.5663254\n1747813855.3201842\n1747813968.0540593\n1747817065.3199458\n\nPipeline has last run trace. 
Use &#039;dlt pipeline omrun trace&#039; to inspect \n<\/pre><\/div>\n\n\n<p>If you install the &#8220;streamlit&#8221; package, you can even bring up a website and inspect your data using the browser:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: bash; highlight: [1,2]; title: ; notranslate\" title=\"\">\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] pip install streamlit\npostgres@rhel8:\/home\/postgres\/db2_postgresql\/ &#x5B;PG1] dlt pipeline omrun show\nFound pipeline omrun in \/home\/postgres\/.dlt\/pipelines\n\nCollecting usage statistics. To deactivate, set browser.gatherUsageStats to false.\n\n\n  You can now view your Streamlit app in your browser.\n\n  Local URL: http:\/\/localhost:8501\n  Network URL: http:\/\/192.168.122.60:8501\n  External URL: http:\/\/146.4.101.46:8501\n<\/pre><\/div>\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"578\" src=\"https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_105630-1024x578.png\" alt=\"\" class=\"wp-image-38840\" srcset=\"https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_105630-1024x578.png 1024w, https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_105630-300x169.png 300w, https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_105630-768x433.png 768w, https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_105630-1536x866.png 1536w, https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_105630.png 1663w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"578\" 
src=\"https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_105718-1024x578.png\" alt=\"\" class=\"wp-image-38841\" srcset=\"https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_105718-1024x578.png 1024w, https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_105718-300x169.png 300w, https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_105718-768x433.png 768w, https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_105718-1536x866.png 1536w, https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_105718.png 1663w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>Really nice.<\/p>\n\n\n\n<p>This was a very simple example, there is much more can do with dlt. Check the <a href=\"https:\/\/dlthub.com\/docs\/intro\" target=\"_blank\" rel=\"noreferrer noopener\">documentation<\/a> for further details.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>For a recent project at one of our customers we needed to get data from a Db2 database into PostgreSQL. The first solution we thought of was the foreign data wrapper for Db2. This is usually easy to setup and configure and all you need are the client libraries (for Db2 in this case). 
But [&hellip;]<\/p>\n","protected":false},"author":29,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[229,198],"tags":[],"type_dbi":[2749],"class_list":["post-38831","post","type-post","status-publish","format-standard","hentry","category-database-administration-monitoring","category-database-management","type-postgresql"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v27.2 (Yoast SEO v27.4) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>Using dlt to get data from Db2 to PostgreSQL - dbi Blog<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.dbi-services.com\/blog\/using-dlt-to-get-data-from-db2-to-postgresql\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Using dlt to get data from Db2 to PostgreSQL\" \/>\n<meta property=\"og:description\" content=\"For a recent project at one of our customers we needed to get data from a Db2 database into PostgreSQL. The first solution we thought of was the foreign data wrapper for Db2. This is usually easy to setup and configure and all you need are the client libraries (for Db2 in this case). 
But [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.dbi-services.com\/blog\/using-dlt-to-get-data-from-db2-to-postgresql\/\" \/>\n<meta property=\"og:site_name\" content=\"dbi Blog\" \/>\n<meta property=\"article:published_time\" content=\"2025-05-21T11:34:32+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-05-21T11:34:35+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.dbi-services.com\/blog\/wp-content\/uploads\/sites\/2\/2025\/05\/Screenshot_20250521_104329.png\" \/>\n\t<meta property=\"og:image:width\" content=\"778\" \/>\n\t<meta property=\"og:image:height\" content=\"462\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Daniel Westermann\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@westermanndanie\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Daniel Westermann\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/\"},\"author\":{\"name\":\"Daniel Westermann\",\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/#\\\/schema\\\/person\\\/8d08e9bd996a89bd75c0286cbabf3c66\"},\"headline\":\"Using dlt to get data from Db2 to PostgreSQL\",\"datePublished\":\"2025-05-21T11:34:32+00:00\",\"dateModified\":\"2025-05-21T11:34:35+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/\"},\"wordCount\":558,\"commentCount\":0,\"image\":{\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/wp-content\\\/uploads\\\/sites\\\/2\\\/2025\\\/05\\\/Screenshot_20250521_104329.png\",\"articleSection\":[\"Database Administration &amp; Monitoring\",\"Database management\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/\",\"url\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/\",\"name\":\"Using dlt to get data from Db2 to PostgreSQL - dbi 
Blog\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/wp-content\\\/uploads\\\/sites\\\/2\\\/2025\\\/05\\\/Screenshot_20250521_104329.png\",\"datePublished\":\"2025-05-21T11:34:32+00:00\",\"dateModified\":\"2025-05-21T11:34:35+00:00\",\"author\":{\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/#\\\/schema\\\/person\\\/8d08e9bd996a89bd75c0286cbabf3c66\"},\"breadcrumb\":{\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/#primaryimage\",\"url\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/wp-content\\\/uploads\\\/sites\\\/2\\\/2025\\\/05\\\/Screenshot_20250521_104329.png\",\"contentUrl\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/wp-content\\\/uploads\\\/sites\\\/2\\\/2025\\\/05\\\/Screenshot_20250521_104329.png\",\"width\":778,\"height\":462},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/using-dlt-to-get-data-from-db2-to-postgresql\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Accueil\",\"item\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Using dlt to get data from Db2 to 
PostgreSQL\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/#website\",\"url\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/\",\"name\":\"dbi Blog\",\"description\":\"\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/#\\\/schema\\\/person\\\/8d08e9bd996a89bd75c0286cbabf3c66\",\"name\":\"Daniel Westermann\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/31350ceeecb1dd8986339a29bf040d4cd3cd087d410deccd8f55234466d6c317?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/31350ceeecb1dd8986339a29bf040d4cd3cd087d410deccd8f55234466d6c317?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/31350ceeecb1dd8986339a29bf040d4cd3cd087d410deccd8f55234466d6c317?s=96&d=mm&r=g\",\"caption\":\"Daniel Westermann\"},\"description\":\"Daniel Westermann is Principal Consultant and Technology Leader Open Infrastructure at dbi services. He has more than 15 years of experience in management, engineering and optimization of databases and infrastructures, especially on Oracle and PostgreSQL. Since the beginning of his career, he has specialized in Oracle Technologies and is Oracle Certified Professional 12c and Oracle Certified Expert RAC\\\/GridInfra. Over time, Daniel has become increasingly interested in open source technologies, becoming \u201cTechnology Leader Open Infrastructure\u201d and PostgreSQL expert. \u00a0Based on community or EnterpriseDB tools, he develops and installs complex high available solutions with PostgreSQL. 
He is also a certified PostgreSQL Plus 9.0 Professional and a Postgres Advanced Server 9.4 Professional. He is a regular speaker at PostgreSQL conferences in Switzerland and Europe. Today Daniel is also supporting our customers on AWS services such as AWS RDS, database migrations into the cloud, EC2 and automated infrastructure management with AWS SSM (System Manager). He is a certified AWS Solutions Architect Professional. Prior to dbi services, Daniel was Management System Engineer at LC SYSTEMS-Engineering AG in Basel. Before that, he worked as Oracle Developper &amp;\u00a0Project Manager at Delta Energy Solutions AG in Basel (today Powel AG). Daniel holds a diploma in Business Informatics (DHBW, Germany). His branch-related experience mainly covers the pharma industry, the financial sector, energy, lottery and telecommunications.\",\"sameAs\":[\"https:\\\/\\\/x.com\\\/westermanndanie\"],\"url\":\"https:\\\/\\\/www.dbi-services.com\\\/blog\\\/author\\\/daniel-westermann\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","_links":{"self":[{"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/posts\/38831","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/users\/29"}],"replies":[{"embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/comments?post=38831"}],"version-history":[{"count":8,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/posts\/38831\/revisions"}],"predecessor-version":[{"id":38852,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/posts\/38831\/revisions\/38852"}],"wp:attachment":[{"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/media?parent=38831"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/categories?post=38831"},{"taxonom
y":"post_tag","embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/tags?post=38831"},{"taxonomy":"type","embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/type_dbi?post=38831"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}