{"id":11630,"date":"2018-09-07T10:38:44","date_gmt":"2018-09-07T08:38:44","guid":{"rendered":"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/"},"modified":"2025-10-24T09:26:21","modified_gmt":"2025-10-24T07:26:21","slug":"a-sqlite-extension-for-gawk-part-iii","status":"publish","type":"post","link":"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/","title":{"rendered":"A SQLite extension for gawk (part III)"},"content":{"rendered":"<p>Welcome to part III of a three-part article on extending gawk with a SQLite binding. Part I is <a title=\"A SQLite extension for gawk (part I)\" href=\"https:\/\/www.dbi-services.com\/blog\/?p=27308\" target=\"_blank\" rel=\"noopener\">here<\/a>. Part II is <a title=\"A SQLite extension for gawk (part II)\" href=\"https:\/\/www.dbi-services.com\/blog\/?p=27406\" target=\"_blank\" rel=\"noopener\">here<\/a>.<br \/>\nIn this final part, I&#8217;ll succinctly explain the code that was listed in part II.<br \/>\nThis part is best read with Part II opened in another browser window.<\/p>\n<h3>sqlite_gawk.c<\/h3>\n<p>Line 48: All the operations on a database require a handle that is initialized once the database is successfully opened. The handle is of type (sqlite3 *). Obviously, gawk does not know about pointers; therefore, what gets returned to gawk after a database has been opened is an integer, an index into a static array of (sqlite3 *). The array has a fixed size of 100 for up to 100 concurrently opened databases, which should be more than enough (if not, just increase MAX_DB on line 46 and recompile, or implement a dynamically resizable array instead). When a database is closed, its slot is of course recycled for the next database to open.<\/p>\n<p>Line 56: Some initialization is performed, mostly of the array of database handles.<\/p>\n<p>Lines 77 to 315: this code has been lifted from sqlite3.c, the SQLite shell, and only slightly edited. 
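As an aside, the handle recycling just described is easy to model outside of C; here is a minimal Python sketch of the same bookkeeping (the names get_free_handle_slot(), db_open() and db_close() are illustrative, not the extension's):

```python
import sqlite3

MAX_DB = 100                 # same fixed capacity as in the C extension
handles = [None] * MAX_DB    # stands in for the static array of (sqlite3 *)

def get_free_handle_slot():
    """Sequentially search for a free slot, as get_free_sqlite_handle() does."""
    for i, h in enumerate(handles):
        if h is None:
            return i
    return -1                # all MAX_DB slots are in use

def db_open(filename):
    """Open a database and hand back an integer index instead of a pointer."""
    slot = get_free_handle_slot()
    if slot < 0:
        return -1
    handles[slot] = sqlite3.connect(filename)
    return slot

def db_close(db):
    """Close the database and recycle its slot for the next open."""
    handles[db].close()
    handles[db] = None
```

Closing a database clears its slot, so the next db_open() reuses the lowest free index; once all MAX_DB slots are taken, db_open() returns -1 here.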
It provides the SQL extension functions readfile() (reads a file into a blob) and writefile() (dumps a blob into a file) for working with blobs from SQL statements. These functions can then be used seamlessly as follows:<br \/>\n<code><br \/>\nINSERT INTO test_with_blob(n1, my_blob) VALUES(1, readfile('gawk-4.2.1.tar.gz'))<br \/>\nSELECT n1, writefile('yy' || rowid, my_blob) FROM test_with_blob<br \/>\nUPDATE test_with_blob set my_blob = readfile('\/home\/dmadmin\/dmgawk\/gawk-4.2.1\/extension\/sqlite_gawk.c') where n1 = 1000<br \/>\n<\/code><br \/>\nSQLite makes it possible to extend the set of functions that can be called in SQL statements. The function sqlite3_create_function() allows registering a new user-defined function, which must comply with a standard profile. Afterwards, such functions are invoked by SQLite as callbacks if they are present in SQL statements. This feature is not exclusive to SQLite: Oracle, for example, allows such functions too, but as stored functions written in PL\/SQL (or in Java or C\/C++ for external procedures with a PL\/SQL interface to the database). SQLite even allows those user-defined functions to perform aggregation like the usual sum(), count() or avg() functions, which is useful for example to supply the missing stddev() function.<br \/>\nreadfile() works with a buffer allocated from memory with the same size as the file to read, which can be problematic with huge files, but blob size is currently limited to about 10^9 bytes by the compilation, and to about 2 GiB by the implementation (see <a title=\"SQLite Limits\" href=\"https:\/\/sqlite.org\/limits.html\" target=\"_blank\" rel=\"noopener\">the limits here<\/a>). A readfile() test with a 900-million-byte file worked flawlessly (see the test script tsqlite.awk). 
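Incidentally, the same registration mechanism is available from Python's standard sqlite3 module through Connection.create_function(), which makes it easy to play with readfile()\/writefile()-style UDFs without compiling anything. A simplified sketch (no error handling):

```python
import sqlite3
import tempfile

def readfile(name):
    """SQL readfile(X): return the whole content of file X as a blob."""
    with open(name, "rb") as f:
        return f.read()

def writefile(name, data):
    """SQL writefile(X, Y): dump blob Y into file X, return the bytes written."""
    with open(name, "wb") as f:
        return f.write(data)

conn = sqlite3.connect(":memory:")
# same idea as sqlite3_create_function() in the C extension:
conn.create_function("readfile", 1, readfile)
conn.create_function("writefile", 2, writefile)

conn.execute("CREATE TABLE test_with_blob(n1 INTEGER, my_blob BLOB)")

with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"hello blob")

# ingest the file into the blob column, then dump it back out
conn.execute("INSERT INTO test_with_blob(n1, my_blob) VALUES(1, readfile(?))",
             (tmp.name,))
size, = conn.execute("SELECT length(my_blob) FROM test_with_blob").fetchone()
conn.execute("SELECT writefile(?, my_blob) FROM test_with_blob",
             (tmp.name + ".out",)).fetchall()
```

Returning bytes from the Python function stores a genuine BLOB, just like the C version.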
On memory-challenged machines, SQLite&#8217;s incremental blob I\/O functions have been put to good use to work with smaller buffers (see the overloaded function sqlite_exec() starting at line 406).<\/p>\n<p>Line 323: get_free_sqlite_handle() is called each time a database is opened. It returns the index of the next free slot in the array of database handles. It does a sequential search in the array, but again a maximum of 100 opened databases is currently supported, so this is acceptable.<\/p>\n<p>Line 335: do_sqlite_open() is the interface&#8217;s wrapper around SQLite&#8217;s sqlite3_open(). It passes it the name of the database and, if successful, receives a database handle. SQLite databases are entirely contained in one file and no password is needed to open it. The received handle is saved into the array of handles and its index is returned to gawk for future use. When gawk calls another function for the opened database, it passes this index to the function, which looks into the array for the corresponding handle, which is then used to talk to SQLite functions.<br \/>\nIn do_sqlite_open(), the readfile() and writefile() user-defined functions are registered for use in SQL statements as callbacks.<\/p>\n<p>Line 387: do_sqlite_close() closes a database and marks the slot in the array of handles as available. Nothing special here.<\/p>\n<p>Line 406: do_sqlite_exec(). This is a big one as, in addition to the expected DML\/DDL statements, it also deals with incremental blob I\/Os. It implements both overloads of the gawk function sqlite_exec(), which are distinguished by the number of parameters they take:<br \/>\n1. int sqlite_exec(db, stmt);<br \/>\n2. sqlite_exec(db, db_name, table, blob_column, rowid, &#8220;readfile(file_name)&#8221; | &#8220;writefile(file_name)&#8221;);<br \/>\nThe first version takes 2 parameters: the database handle and the DML\/DDL statement to send to the database. 
It returns 0 if OK and -1 if not.<br \/>\nThe second one is for incremental blob I\/Os and takes 6 parameters:<br \/>\nthe database handle;<br \/>\nthe database name; this is a surprising one, but it is required later by SQLite in the function sqlite3_blob_open() documented <a title=\"sqlite3_blob_open\" href=\"https:\/\/sqlite.org\/c3ref\/blob_open.html\" target=\"_blank\" rel=\"noopener\">here<\/a>. It is a symbolic name (not the file name of the opened database) and must be set either to &#8220;main&#8221; for the main database, to &#8220;temp&#8221; for a temporary table, or to the name after the AS clause for ATTACHed databases (see <a title=\"ATTACH DATABASE\" href=\"https:\/\/sqlite.org\/lang_attach.html\" target=\"_blank\" rel=\"noopener\">here<\/a>);<br \/>\nthe triplet (table, column, rowid) is also required to identify the blob to access;<br \/>\nthe last parameter tells the operation to perform on the blob: either update it with the content of a file (parameter &#8220;readfile(filename)&#8221;) or dump it to a file (parameter &#8220;writefile(filename)&#8221;). Here, contrary to what has been said for lines 77 to 315, readfile and writefile are not user-defined SQL extensions but just placeholders for the requested operations. An alternative syntax could have been:<br \/>\n<code><br \/>\nsqlite_exec(db, db_name, table, blob_column, rowid, \"read\" | \"write\", file_name);<br \/>\nor<br \/>\nsqlite_exec(db, db_name, table, blob_column, rowid, \"load\" | \"dump\", file_name);<br \/>\n<\/code><br \/>\nbut it was felt that reusing the verbs readfile() and writefile() was more standard, albeit admittedly a bit weird.<br \/>\nThe filename between the parentheses should not be quoted, because the actual parameters to sqlite_exec() are the strings &#8220;readfile(filename)&#8221; or &#8220;writefile(filename)&#8221;. Internally, 
starting at line 500, they are parsed and the operation and the filename are extracted from the strings; see the regular-expression strings *RE_readfile and *RE_writefile.<br \/>\nWhen reading into an existing blob, SQLite does not allow its size to be changed through sqlite3_blob_write(); therefore, the blob is first UPDATEd with a value of the new size. This is done by assigning it zero bytes via the SQLite function zeroblob() during the UPDATE; this function is efficient in that it does not need to first instantiate a memory buffer of that size and fill it with zeros before assigning it to the blob. Once the blob is updated, it must be reopened.<\/p>\n<p>Lines 573 and 605: For an efficient transfer of data between the file and the blob, a 10 MiB memory buffer is allocated.<br \/>\nOverall, there are a lot of goto statements in this function. Don&#8217;t be shocked or disgusted; rather, consider them as &#8220;exit &lt;label&gt;&#8221; or &#8220;break &lt;label&gt;&#8221; statements found in other programming languages. All these gotos just exit nested blocks and therefore go in the same direction: strictly downwards. No spaghetti code here.<br \/>\nThe function is also quite large at almost 240 lines, but most of it is error-handling and cleanup code. To compact the function somewhat, the read and write parts could be split into two functions that are called only once; it&#8217;s a matter of personal preference. I think that most editors nowadays can display more than 25&#215;80 lines\/characters of code, so let&#8217;s take advantage of this bleeding-edge technology!<\/p>\n<p>Line 678: All the functions that display results from a SQL SELECT statement are actually callbacks; they are invoked by SQLite&#8217;s sqlite3_exec() from within the extension&#8217;s do_sqlite_select() and must comply with a precise profile. 
In order to keep some control of their behavior from our own code, it is possible to pass them a pointer to some data, which is the purpose of the DISPLAYED_TABLE C struct. It contains formatting parameters and one variable familiar to awk users, NR, which gets updated by the callbacks at each call. sqlite3_exec() invokes the callback for each row that it fetches from the database, and the callback (all the select_callback_* functions except select_callback_array(), which stores the row into a gawk associative array; more on this later) displays that row according to the parameters in struct DISPLAYED_TABLE. The callbacks are:<br \/>\n<code><br \/>\nselect_callback_raw()<br \/>\nselect_callback_draft()<br \/>\nselect_callback_sized()<br \/>\nselect_callback_array()<br \/>\n<\/code><br \/>\nEach of these displays the rows in its particular format, as discussed <a title=\"A SQLite extension for gawk (part I)\" href=\"https:\/\/www.dbi-services.com\/blog\/?p=27308\" target=\"_blank\" rel=\"noopener\">here<\/a>, so I won&#8217;t repeat myself. As expected, presentation code is always quite lengthy, especially for all this line-truncation with or without ellipsis, and line-wrapping.<\/p>\n<p>Line 960: An interesting functionality, much shorter to implement since no presentation is required, is storing the rows into a gawk associative array, done in function select_callback_array(). When gawk invokes sqlite_select() with 4 parameters (the 3rd is a dummy one introduced just to tell the overloads apart), select_callback_array() is selected as sqlite3_exec()&#8217;s callback, e.g.:<br \/>\n<code><br \/>\nsqlite_select(db, \"select rowid from test_with_blob where n1 = 1000 limit 1\", 0, a_test)<br \/>\n<\/code><br \/>\nThe associative array a_test receives the rows and columns and stores them as shown in the function&#8217;s comment. On lines 1000 to 1011, another approach was attempted, which, however, I couldn&#8217;t make work. This is not such a loss, as it looked a bit weird anyway. 
It consisted in programmatically creating the gawk array (with the given name) and injecting it into the gawk script so that it would magically appear among the script&#8217;s global variables. The gawk call would have been:<br \/>\n<code><br \/>\nsqlite_select(db, \"select rowid from test_with_blob where n1 = 1000 limit 1\", 0, <strong><em>\"a_test\"<\/em><\/strong>)<br \/>\n<\/code><br \/>\nNote the string &#8220;a_test&#8221;, the name of the array to create. Upon returning, an array with that name would have been created and added to the script&#8217;s symbol table, and could then be accessed as usual through a_test[<em>line<\/em>][<em>col<\/em>]. Currently, the array must already exist and have been &#8220;typed&#8221; as an array in the gawk script, which is more natural. The array is then filled using gawk&#8217;s extension API as documented in the Gawk Manual.<\/p>\n<p>Line 1190: do_sqlite_select() terminates by directly invoking the callbacks to give them the opportunity to print an epilogue such as the number of processed rows or the optimum column widths (by select_callback_draft()).<br \/>\nLine 1206: the extension functions are registered in gawk as usual. Note the varying arities: do_sqlite_exec() takes from 2 to 6 parameters while do_sqlite_select() takes from 2 to 4.<\/p>\n<h3>tsqlite.awk<\/h3>\n<p>This is the gawk test script.<\/p>\n<p>Line 6: the SQLite extension is loaded here. The name is the one that was given to it on line 1218 of sqlite_gawk.c.<\/p>\n<p>Lines 9 to 20: Database opening is tested here. Apparently, it is not a problem to have the same database opened several times. SQLite allows concurrent reads but locks the entire database while a write is performed; its concurrency is therefore quite limited, but in practice this is not a problem as the locks last only a few milliseconds.<\/p>\n<p>Line 23: sqlite_exec() is tested with a DDL statement. 
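Such a DDL statement declares column types, but SQLite treats them only loosely, as the next paragraph explains; a quick illustration with Python's standard sqlite3 module (the table name t is made up for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# CHAR(10) only gives the column TEXT affinity; it is not a length constraint.
conn.execute("CREATE TABLE t(c CHAR(10))")

conn.execute("INSERT INTO t VALUES (?)", ("far longer than ten characters",))
conn.execute("INSERT INTO t VALUES (?)", (b"\x00\x01 a blob",))  # a blob fits too

rows = conn.execute("SELECT c, typeof(c) FROM t").fetchall()
# rows -> [('far longer than ten characters', 'text'), (b'\x00\x01 a blob', 'blob')]
```

Neither the declared length nor the declared type is enforced; the oversized string and the blob are stored untouched.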
Actually, SQLite allows inserting any value into a column, even though the types and sizes don&#8217;t match. It uses type affinity to convert between types (see <a title=\"SQLite Types\" href=\"https:\/\/sqlite.org\/datatype3.html\" target=\"_blank\" rel=\"noopener\">here<\/a> for details). It&#8217;s even possible to store a blob in a CHAR(10) column.<\/p>\n<p>Lines 26 to 37: sqlite_exec() is tested with a few DML statements.<\/p>\n<p>Lines 40 to 51: The select-into-a-gawk-array feature is tested. A dummy 3rd parameter is inserted to force the select_callback_array() callback to be invoked. The result is output through the dumparray() function from the manual (its code is given at the end of the script), a function that uses the gawk isarray() introspection function to recursively display an array (and possibly an array of arrays of &#8230; etc.). This test confirms that the receiving array has the right structure, an integer-indexed array of associative arrays indexed by column names.<\/p>\n<p>Lines 53 to 112: we test the different output formats from the other 3 callbacks, select_callback_raw(), select_callback_draft() and select_callback_sized(). As said, these functions are selected based on the type and number of parameters of the overloaded function sqlite_select().<\/p>\n<p>Line 114: sqlite_exec() tests the DELETE DML statement. Note that SQLite performs a TRUNCATE TABLE optimization here when no condition is present (i.e. there is no WHERE clause).<\/p>\n<p>Lines 126 to 181: we test the blob operations through the SQL extension functions readfile() and writefile(). Files of different sizes, up to 990 million bytes, are stuffed into a blob with no problem, except that the ingestion time is quite perceptible for big files. As said before, the implementation of the readfile() function, which reads data from a disk file into a blob, allocates a buffer in memory to entirely hold the file. 
If that allocation fails, incremental blob I\/O functions are available; read on.<\/p>\n<p>Lines 183 to 200: They test the incremental blob I\/O functions. These make it possible to work around any memory-size problem, as they use a fixed-size 10 MiB buffer instead of allocating one the size of the file to read in, or one for the blob to dump out. As sqlite_exec() needs the blob&#8217;s rowid here, it must first be read through a SELECT statement. The received value is passed into a gawk array a_test in a_test[0][&#8220;rowid&#8221;], i.e. the result has one row only (stored in a_test[0]) and this row has only one column, named &#8220;rowid&#8221;. The currently opened database has the alias &#8220;main&#8221;. On line 191, note how writefile(<em>filename<\/em>) is quoted so the function is not evaluated by the gawk interpreter but passed through to do_sqlite_exec(). Note also how <em>filename<\/em> is not quoted; it does not need to be. It may look strange though, and if this is insufferable just edit the REs on lines 501 and 502 of the extension to add the quotes. Beware that they&#8217;ll be embedded in an already quoted string, so some escaping might be necessary to placate gawk&#8217;s parsing.<\/p>\n<p>Lines 201 to 214: We are testing the on-screen formatting of the columns when a large chunk of text from a blob is displayed. We noticed that if the text contains line feeds, it corrupts the display; this is why the replace() SQL function is used on line 210 to change them to the literal string &#8220;\\n&#8221;, which won&#8217;t be interpreted by the terminal.<\/p>\n<p>Line 216: The database is closed.<\/p>\n<h3>A stress test<\/h3>\n<p>How about stressing SQLite and its gawk interface a little bit? 
Here is a very simple script that recursively scans a given file directory up to a predefined depth and inserts the file entries, a few metadata fields and the content, into a table.<\/p>\n<pre class=\"brush: bash; gutter: true; first-line: 1; highlight: [14]\">\n# stress test program for sqlite_gawk, the interface to sqlite3;\n# Cesare Cervini\n# dbi-services.com\n# 8\/2018\n\n# the test consists in scanning a disk directory for specified files and loading into a sqlite db table all the files' metadata and content;\n# the database stress-test.db must have been created beforehand;\n# the used table will be created by the test;\n# Usage:\n#   AWKLIBPATH=gawk-4.2.1\/extension\/.libs find ~\/ -type f -printf \"%M %u %g %s %TY-%Tm-%Td %TH:%TM %p\\n\" | gawk-4.2.1\/gawk -f stress_sqlite.awk\n# or (if AWKLIBPATH is defined appropriately or the SQLite extension has been deployed system-wide):\n#   find -L ~\/ -type f -printf \"%M %u %g %s %TY-%Tm-%Td %TH:%TM %p\\n\" | gawk -f stress_sqlite.awk\n# for testing:\n#    max_depth=5; limit=10; AWKLIBPATH=gawk-4.2.1\/extension\/.libs find ~\/ -maxdepth $max_depth -type f -printf \"%M %u %g %s %TY-%Tm-%Td %TH:%TM %p\\n\" | gawk -v limit=$limit -f stress_sqlite.awk\n#\n\n@load \"sqlite_gawk\"\n\nBEGIN {\n   my_db = sqlite_open(\"\/home\/dmadmin\/dmgawk\/stress-test.db\")\n   print \"my_db =\", my_db, \"after db opened\"\n\n   rc = sqlite_select(my_db, \"SELECT sqlite_version()\", 20)\n   printf \"\\n\"\n\n   rc = sqlite_exec(my_db, \"DROP TABLE IF EXISTS stress\")\n   rc = sqlite_exec(my_db, \"vacuum\")\n\n   rc = sqlite_exec(my_db, \"CREATE TABLE stress(\" \n                                                \"file_type CHAR,\"     \n                                                \"permissions TEXT,\"   \n                                                \"owner_name TEXT,\"    \n                                                \"group_name TEXT,\"    \n                                                \"file_size NUMBER,\"   \n                      
                          \"file_date DATE,\"     \n                                                \"file_name TEXT,\"     \n                                                \"content_type TEXT,\"  \n                                                \"content BLOB)\")\n   print \"return code=\", rc, \"after table creation\"\n   printf \"\\n\"\n}\n{\n   file_type = substr($1, 1, 1)\n   permissions = substr($1, 2)\n   owner_name = $2\n   group_name = $3\n   file_size = $4\n   file_date = $5 \" \" $6\n   file_name = $7\n   cmd = \"file -b \" $7; cmd | getline content_type; close(cmd)\n  \n   stmt = sprintf(\"INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\\\"%s\\\", \\\"%s\\\", \\\"%s\\\", \\\"%s\\\", \\\"%s\\\", \\\"%s\\\", \\\"%s\\\", \\\"%s\\\", %s)\",\n                                      file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, \"readfile('\" file_name \"')\")\n   print stmt\n   rc = sqlite_exec(my_db, stmt)\n   printf(\"return code=%d after insertion\\n\\n\", rc)\n\n   if (NR &gt;= limit)\n      exit\n}\nEND {\n   printf \"\\n\"\n   rc = sqlite_select(my_db, \"SELECT file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, CASE WHEN instr(content_type, 'ASCII text') &gt; 0 THEN \" \n                             \"CASE WHEN length(content) &gt; 200 THEN replace(replace(replace(substr(content, 1, 200) || '...', '\\n', '\\\\n'), '\\t', '\\\\t'), '\\r', '\\\\r') ELSE replace(replace(replace(content, '\\n', '\\\\n'), '\\t', '\\\\t'), '\\r', '\\\\r') END ELSE substr(hex(content), 1, 200) || '...' 
END as text FROM stress\", 0, my_array)\n   print \"return code = \", rc, \"after select into array\"\n   dumparray(\"my_array\", my_array)\n\n   printf \"\\n\"\n   rc = sqlite_select(my_db, \"SELECT file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, CASE WHEN instr(content_type, 'ASCII text') &gt; 0 THEN \" \n                             \"CASE WHEN length(content) &gt; 200 THEN replace(replace(replace(substr(content, 1, 200) || '...', '\\n', '\\\\n'), '\\t', '\\\\t'), '\\r', '\\\\r') ELSE replace(replace(replace(content, '\\n', '\\\\n'), '\\t', '\\\\t'), '\\r', '\\\\r') END ELSE substr(hex(content), 1, 200) || '...' END as text FROM stress\", \"9 11 10 10 9 16 30w 12w 50w\")\n   print \"return code = \", rc, \"after select\"\n\n   printf \"\\n\"\n   rc = sqlite_close(my_db)\n   print \"return code=\", rc, \"after closing database\"\n}\n\nfunction dumparray(name, array, i) {\n   for (i in array)\n      if (isarray(array[i]))\n         dumparray(name \"[\\\"\" i \"\\\"]\", array[i])\n      else\n         printf(\"%s[\\\"%s\\\"] = %s\\n\", name, i, array[i])\n}\n<\/pre>\n<p>On line 23, the version of SQLite is printed; it should be 3.24.0.<br \/>\nOn line 28, the stress table is created with a suitable structure to accept the files&#8217; metadata of interest as output by the &#8220;ls -l&#8221; command, e.g.:<br \/>\n<code><br \/>\n-r-xr-xr-x 1 dmadmin dmadmin 1641005 2017-01-26 05:14 \/home\/dmadmin\/oracle\/instantclient_12_2\/libsqlplusic.so<br \/>\n<\/code><br \/>\nLines 43 to 50 extract into variables the metadata from each line of input to gawk from the find command. 
This intermediary step has been introduced for clarity as $* variables could directly be used on line 52.<br \/>\nLine 52 builds the INSERT SQL statement that gets passed to line 55 for execution.<br \/>\nLine 58 checks whether the file count limit has been reached and aborts the loop if so.<br \/>\nFinally, line 63 extracts the content of the stress table into an array, which is dumped on line 66, while line 69 displays it in such a way that the output is not too messy.<\/p>\n<p>Here is an example of the result:<br \/>\n<code><br \/>\nmax_depth=5; limit=20; find ~\/ -maxdepth $max_depth -type f -printf \"%M %u %g %s %TY-%Tm-%Td %TH:%TM %p\\n\" | gawk -v limit=$limit -f stress_sqlite.awk<br \/>\nmy_db = 0 after db opened<br \/>\nsqlite_version()<br \/>\n--------------------<br \/>\n3.24.0<br \/>\n1 rows selected<\/p>\n<p>return code= 0 after table creation<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rw-rw----\", \"dmadmin\", \"dmadmin\", \"2009\", \"2017-11-17 18:16\", \"\/home\/dmadmin\/oraInventory\/logs\/oraInstall2017-11-17_06-11-23PM.err\", \"ASCII text\", readfile('\/home\/dmadmin\/oraInventory\/logs\/oraInstall2017-11-17_06-11-23PM.err'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rw-rw----\", \"dmadmin\", \"dmadmin\", \"1309\", \"2017-11-17 19:25\", \"\/home\/dmadmin\/oraInventory\/logs\/oraInstall2017-11-17_06-11-23PM.out\", \"ASCII text\", readfile('\/home\/dmadmin\/oraInventory\/logs\/oraInstall2017-11-17_06-11-23PM.out'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rw-rw----\", \"dmadmin\", \"dmadmin\", \"2973490\", \"2017-11-17 19:29\", 
\"\/home\/dmadmin\/oraInventory\/logs\/installActions2017-11-17_06-11-23PM.log\", \"ASCII text, with very long lines\", readfile('\/home\/dmadmin\/oraInventory\/logs\/installActions2017-11-17_06-11-23PM.log'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rw-rw----\", \"dmadmin\", \"dmadmin\", \"292\", \"2017-11-17 19:26\", \"\/home\/dmadmin\/oraInventory\/ContentsXML\/libs.xml\", \"XML 1.0 document, ASCII text\", readfile('\/home\/dmadmin\/oraInventory\/ContentsXML\/libs.xml'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rw-rw----\", \"dmadmin\", \"dmadmin\", \"468\", \"2017-11-17 19:25\", \"\/home\/dmadmin\/oraInventory\/ContentsXML\/inventory.xml\", \"XML 1.0 document, ASCII text\", readfile('\/home\/dmadmin\/oraInventory\/ContentsXML\/inventory.xml'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rw-rw----\", \"dmadmin\", \"dmadmin\", \"329\", \"2017-11-17 19:26\", \"\/home\/dmadmin\/oraInventory\/ContentsXML\/comps.xml\", \"XML 1.0 document, ASCII text\", readfile('\/home\/dmadmin\/oraInventory\/ContentsXML\/comps.xml'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rw-rw----\", \"dmadmin\", \"dmadmin\", \"60\", \"2017-11-17 19:26\", \"\/home\/dmadmin\/oraInventory\/oraInst.loc\", \"ASCII text\", readfile('\/home\/dmadmin\/oraInventory\/oraInst.loc'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, 
content) VALUES(\"-\", \"rwxrwx---\", \"dmadmin\", \"dmadmin\", \"1661\", \"2017-11-17 19:26\", \"\/home\/dmadmin\/oraInventory\/orainstRoot.sh\", \"POSIX shell script, ASCII text executable\", readfile('\/home\/dmadmin\/oraInventory\/orainstRoot.sh'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rw-rw----\", \"dmadmin\", \"dmadmin\", \"362\", \"2017-11-17 18:16\", \"\/home\/dmadmin\/oraInventory\/oui\/srcs.lst\", \"XML 1.0 document, ASCII text\", readfile('\/home\/dmadmin\/oraInventory\/oui\/srcs.lst'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rw-------\", \"dmadmin\", \"dmadmin\", \"4747\", \"2018-09-01 13:59\", \"\/home\/dmadmin\/.sqlite_history\", \"assembler source, ASCII text\", readfile('\/home\/dmadmin\/.sqlite_history'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rw-rw-r--\", \"dmadmin\", \"dmadmin\", \"176\", \"2018-08-08 22:49\", \"\/home\/dmadmin\/mp-fact.awk\", \"ASCII text\", readfile('\/home\/dmadmin\/mp-fact.awk'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rwxrwxr-x\", \"dmadmin\", \"dmadmin\", \"24951775\", \"2016-11-04 12:13\", \"\/home\/dmadmin\/setup_files\/dfcUnix.zip\", \"Zip archive data, at least v1.0 to extract\", readfile('\/home\/dmadmin\/setup_files\/dfcUnix.zip'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rwxrwxr-x\", \"dmadmin\", 
\"dmadmin\", \"2179169\", \"2016-11-04 12:24\", \"\/home\/dmadmin\/setup_files\/serviceWrapperManager.jar\", \"Java archive data (JAR)\", readfile('\/home\/dmadmin\/setup_files\/serviceWrapperManager.jar'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rwxrwxr-x\", \"dmadmin\", \"dmadmin\", \"53170976\", \"2005-03-23 15:29\", \"\/home\/dmadmin\/setup_files\/53\/dfcSetup.jar\", \"Zip archive data, at least v2.0 to extract\", readfile('\/home\/dmadmin\/setup_files\/53\/dfcSetup.jar'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rwxrwxr-x\", \"dmadmin\", \"dmadmin\", \"129883\", \"2005-03-20 08:29\", \"\/home\/dmadmin\/setup_files\/53\/consistency_checker.ebs\", \"ASCII text, with CRLF line terminators\", readfile('\/home\/dmadmin\/setup_files\/53\/consistency_checker.ebs'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rw-rw-r--\", \"dmadmin\", \"dmadmin\", \"803\", \"2017-11-28 09:44\", \"\/home\/dmadmin\/setup_files\/53\/init53\", \"ASCII text\", readfile('\/home\/dmadmin\/setup_files\/53\/init53'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rwxrwxr-x\", \"dmadmin\", \"dmadmin\", \"174586519\", \"2005-03-23 20:16\", \"\/home\/dmadmin\/setup_files\/53\/server.jar\", \"Zip archive data, at least v2.0 to extract\", readfile('\/home\/dmadmin\/setup_files\/53\/server.jar'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, 
file_name, content_type, content) VALUES(\"-\", \"rwxrwxr-x\", \"dmadmin\", \"dmadmin\", \"9843412\", \"2004-12-03 01:25\", \"\/home\/dmadmin\/setup_files\/53\/tomcat4127Setup.jar\", \"Zip archive data, at least v2.0 to extract\", readfile('\/home\/dmadmin\/setup_files\/53\/tomcat4127Setup.jar'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rwxrwxr-x\", \"dmadmin\", \"dmadmin\", \"5587942\", \"2005-03-23 15:27\", \"\/home\/dmadmin\/setup_files\/53\/bofcollaborationSetup.jar\", \"Zip archive data, at least v2.0 to extract\", readfile('\/home\/dmadmin\/setup_files\/53\/bofcollaborationSetup.jar'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>INSERT INTO stress(file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, content) VALUES(\"-\", \"rwxrwxr-x\", \"dmadmin\", \"dmadmin\", \"5900841\", \"2005-03-23 15:27\", \"\/home\/dmadmin\/setup_files\/53\/bofworkflowSetup.jar\", \"Zip archive data, at least v2.0 to extract\", readfile('\/home\/dmadmin\/setup_files\/53\/bofworkflowSetup.jar'))<br \/>\nreturn code=0 after insertion<\/p>\n<p>20 rows selected<br \/>\nreturn code =  0 after select into array<br \/>\nmy_array[\"0\"][\"owner_name\"] = dmadmin<br \/>\nmy_array[\"0\"][\"file_type\"] = -<br \/>\nmy_array[\"0\"][\"group_name\"] = dmadmin<br \/>\nmy_array[\"0\"][\"permissions\"] = rw-rw----<br \/>\nmy_array[\"0\"][\"text\"] = java.lang.NullPointerException\\n\\tat oracle.sysman.oii.oiin.OiinNetOps.addNICInfo(OiinNetOps.java:144)\\n\\tat oracle.sysman.oii.oiin.OiinNetOps.computeNICList(OiinNetOps.java:109)\\n\\tat oracle.sysman.oii.oii...<br \/>\nmy_array[\"0\"][\"file_name\"] = \/home\/dmadmin\/oraInventory\/logs\/oraInstall2017-11-17_06-11-23PM.err<br \/>\nmy_array[\"0\"][\"file_size\"] = 2009<br \/>\nmy_array[\"0\"][\"file_date\"] = 2017-11-17 18:16<br \/>\nmy_array[\"0\"][\"content_type\"] = ASCII 
text<br \/>\nmy_array[\"1\"][\"owner_name\"] = dmadmin<br \/>\nmy_array[\"1\"][\"file_type\"] = -<br \/>\nmy_array[\"1\"][\"group_name\"] = dmadmin<br \/>\nmy_array[\"1\"][\"permissions\"] = rw-rw----<br \/>\nmy_array[\"1\"][\"text\"] = You can find the log of this install session at:n \/home\/dmadmin\/oraInventory\/logs\/installActions2017-11-17_06-11-23PM.lognError in invoking target 'install' of makefile '\/home\/dmadmin\/oracle\/database\/...<br \/>\nmy_array[\"1\"][\"file_name\"] = \/home\/dmadmin\/oraInventory\/logs\/oraInstall2017-11-17_06-11-23PM.out<br \/>\nmy_array[\"1\"][\"file_size\"] = 1309<br \/>\nmy_array[\"1\"][\"file_date\"] = 2017-11-17 19:25<br \/>\nmy_array[\"1\"][\"content_type\"] = ASCII text<br \/>\nmy_array[\"2\"][\"owner_name\"] = dmadmin<br \/>\nmy_array[\"2\"][\"file_type\"] = -<br \/>\nmy_array[\"2\"][\"group_name\"] = dmadmin<br \/>\nmy_array[\"2\"][\"permissions\"] = rw-rw----<br \/>\nmy_array[\"2\"][\"text\"] = INFO: Using paramFile: \/home\/dmadmin\/Downloads\/database\/install\/oraparam.ininINFO: nINFO: nINFO: Checking Temp space: must be greater than 500 MB.   Actual 75674 MB    PassednINFO: Checking swap space...<br \/>\nmy_array[\"2\"][\"file_name\"] = \/home\/dmadmin\/oraInventory\/logs\/installActions2017-11-17_06-11-23PM.log<br \/>\nmy_array[\"2\"][\"file_size\"] = 2973490<br \/>\nmy_array[\"2\"][\"file_date\"] = 2017-11-17 19:29<br \/>\nmy_array[\"2\"][\"content_type\"] = ASCII text, with very long lines<br \/>\nmy_array[\"3\"][\"owner_name\"] = dmadmin<br \/>\nmy_array[\"3\"][\"file_type\"] = -<br \/>\nmy_array[\"3\"][\"group_name\"] = dmadmin<br \/>\nmy_array[\"3\"][\"permissions\"] = rw-rw----<br \/>\nmy_array[\"3\"][\"text\"] = n<!-- Copyright (c) 1999, 2017, Oracle and\/or its affiliates.nAll rights reserved. -->n<!-- Do not modify the contents of this file by hand. 
-->nn&lt;QUER...<br \/>\nmy_array[&quot;3&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/oraInventory\/ContentsXML\/libs.xml<br \/>\nmy_array[&quot;3&quot;][&quot;file_size&quot;] = 292<br \/>\nmy_array[&quot;3&quot;][&quot;file_date&quot;] = 2017-11-17 19:26<br \/>\nmy_array[&quot;3&quot;][&quot;content_type&quot;] = XML 1.0 document, ASCII text<br \/>\nmy_array[&quot;4&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;4&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;4&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;4&quot;][&quot;permissions&quot;] = rw-rw----<br \/>\nmy_array[&quot;4&quot;][&quot;text&quot;] = n<!-- Copyright (c) 1999, 2017, Oracle and\/or its affiliates.nAll rights reserved. -->n<!-- Do not modify the contents of this file by hand. -->nn&lt;VER...<br \/>\nmy_array[&quot;4&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/oraInventory\/ContentsXML\/inventory.xml<br \/>\nmy_array[&quot;4&quot;][&quot;file_size&quot;] = 468<br \/>\nmy_array[&quot;4&quot;][&quot;file_date&quot;] = 2017-11-17 19:25<br \/>\nmy_array[&quot;4&quot;][&quot;content_type&quot;] = XML 1.0 document, ASCII text<br \/>\nmy_array[&quot;5&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;5&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;5&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;5&quot;][&quot;permissions&quot;] = rw-rw----<br \/>\nmy_array[&quot;5&quot;][&quot;text&quot;] = n<!-- Copyright (c) 1999, 2017, Oracle and\/or its affiliates.nAll rights reserved. -->n<!-- Do not modify the contents of this file by hand. 
-->nn&lt;TL_L...<br \/>\nmy_array[&quot;5&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/oraInventory\/ContentsXML\/comps.xml<br \/>\nmy_array[&quot;5&quot;][&quot;file_size&quot;] = 329<br \/>\nmy_array[&quot;5&quot;][&quot;file_date&quot;] = 2017-11-17 19:26<br \/>\nmy_array[&quot;5&quot;][&quot;content_type&quot;] = XML 1.0 document, ASCII text<br \/>\nmy_array[&quot;6&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;6&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;6&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;6&quot;][&quot;permissions&quot;] = rw-rw----<br \/>\nmy_array[&quot;6&quot;][&quot;text&quot;] = inventory_loc=\/home\/dmadmin\/oraInventoryninst_group=dmadminn<br \/>\nmy_array[&quot;6&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/oraInventory\/oraInst.loc<br \/>\nmy_array[&quot;6&quot;][&quot;file_size&quot;] = 60<br \/>\nmy_array[&quot;6&quot;][&quot;file_date&quot;] = 2017-11-17 19:26<br \/>\nmy_array[&quot;6&quot;][&quot;content_type&quot;] = ASCII text<br \/>\nmy_array[&quot;7&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;7&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;7&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;7&quot;][&quot;permissions&quot;] = rwxrwx---<br \/>\nmy_array[&quot;7&quot;][&quot;text&quot;] = #!\/bin\/shnAWK=\/bin\/awknCHMOD=\/bin\/chmodnCHGRP=\/bin\/chgrpnCP=\/bin\/cpnECHO=\/bin\/echonMKDIR=\/bin\/mkdirnRUID=`\/usr\/bin\/id|$AWK -F( &#039;{print $1}&#039;|$AWK -F= &#039;{print $2}&#039;`nif [ ${RUID} != &quot;0&quot; ];thenn   $ECHO...<br \/>\nmy_array[&quot;7&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/oraInventory\/orainstRoot.sh<br \/>\nmy_array[&quot;7&quot;][&quot;file_size&quot;] = 1661<br \/>\nmy_array[&quot;7&quot;][&quot;file_date&quot;] = 2017-11-17 19:26<br \/>\nmy_array[&quot;7&quot;][&quot;content_type&quot;] = POSIX shell script, ASCII text executable<br \/>\nmy_array[&quot;8&quot;][&quot;owner_name&quot;] = 
dmadmin<br \/>\nmy_array[&quot;8&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;8&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;8&quot;][&quot;permissions&quot;] = rw-rw----<br \/>\nmy_array[&quot;8&quot;][&quot;text&quot;] = n<!-- Copyright (c) 1999, 2017, Oracle and\/or its affiliates.nAll rights reserved. -->n<!-- Do not modify the contents of this file by hand. -->nn...<br \/>\nmy_array[\"8\"][\"file_name\"] = \/home\/dmadmin\/oraInventory\/oui\/srcs.lst<br \/>\nmy_array[\"8\"][\"file_size\"] = 362<br \/>\nmy_array[\"8\"][\"file_date\"] = 2017-11-17 18:16<br \/>\nmy_array[\"8\"][\"content_type\"] = XML 1.0 document, ASCII text<br \/>\nmy_array[\"9\"][\"owner_name\"] = dmadmin<br \/>\nmy_array[\"9\"][\"file_type\"] = -<br \/>\nmy_array[\"9\"][\"group_name\"] = dmadmin<br \/>\nmy_array[\"9\"][\"permissions\"] = rw-------<br \/>\nmy_array[\"9\"][\"text\"] = .helpn.helpn.exitn.help createn.tablesncreate table test(a1 number)ngon.tablesnselect * from testn\/nselect * from testngon.save test.dbn.exitn.open stress-test.dbn.hn.healpn.helpn.scheman.schema %n.sc...<br \/>\nmy_array[\"9\"][\"file_name\"] = \/home\/dmadmin\/.sqlite_history<br \/>\nmy_array[\"9\"][\"file_size\"] = 4747<br \/>\nmy_array[\"9\"][\"file_date\"] = 2018-09-01 13:59<br \/>\nmy_array[\"9\"][\"content_type\"] = assembler source, ASCII text<br \/>\nmy_array[\"10\"][\"owner_name\"] = dmadmin<br \/>\nmy_array[\"10\"][\"file_type\"] = -<br \/>\nmy_array[\"10\"][\"group_name\"] = dmadmin<br \/>\nmy_array[\"10\"][\"permissions\"] = rw-rw-r--<br \/>\nmy_array[\"10\"][\"text\"] = echo 10000 | gawk -M '{facto = factorial($0); printf(\"%dn%d digitsn\", facto, length(facto))} function factorial(n) {r = 1; for (i = 1; i &lt;= n; i++) r *= i; return r}&#039; | lessn<br \/>\nmy_array[&quot;10&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/mp-fact.awk<br \/>\nmy_array[&quot;10&quot;][&quot;file_size&quot;] = 176<br \/>\nmy_array[&quot;10&quot;][&quot;file_date&quot;] = 2018-08-08 
22:49<br \/>\nmy_array[&quot;10&quot;][&quot;content_type&quot;] = ASCII text<br \/>\nmy_array[&quot;11&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;11&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;11&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;11&quot;][&quot;permissions&quot;] = rwxrwxr-x<br \/>\nmy_array[&quot;11&quot;][&quot;text&quot;] = 504B03040A0000000000A9596449000000000000000000000000040000006466632F504B03040A000000000071863D490000000000000000000000000D0000006466632F6C6963656E7365732F504B03040A000000000071863D49000000000000000000...<br \/>\nmy_array[&quot;11&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/setup_files\/dfcUnix.zip<br \/>\nmy_array[&quot;11&quot;][&quot;file_size&quot;] = 24951775<br \/>\nmy_array[&quot;11&quot;][&quot;file_date&quot;] = 2016-11-04 12:13<br \/>\nmy_array[&quot;11&quot;][&quot;content_type&quot;] = Zip archive data, at least v1.0 to extract<br \/>\nmy_array[&quot;12&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;12&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;12&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;12&quot;][&quot;permissions&quot;] = rwxrwxr-x<br \/>\nmy_array[&quot;12&quot;][&quot;text&quot;] = 504B03040A0000000000785BD842000000000000000000000000090004004D4554412D494E462FFECA0000504B03040A0000000800775BD8428241A6B65C00000066000000140000004D4554412D494E462F4D414E49464553542E4D46F34DCCCB4C4B2D...<br \/>\nmy_array[&quot;12&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/setup_files\/serviceWrapperManager.jar<br \/>\nmy_array[&quot;12&quot;][&quot;file_size&quot;] = 2179169<br \/>\nmy_array[&quot;12&quot;][&quot;file_date&quot;] = 2016-11-04 12:24<br \/>\nmy_array[&quot;12&quot;][&quot;content_type&quot;] = Java archive data (JAR)<br \/>\nmy_array[&quot;13&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;13&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;13&quot;][&quot;group_name&quot;] = 
dmadmin<br \/>\nmy_array[&quot;13&quot;][&quot;permissions&quot;] = rwxrwxr-x<br \/>\nmy_array[&quot;13&quot;][&quot;text&quot;] = 504B030414000800080036337732000000000000000000000000140000004D4554412D494E462F4D414E49464553542E4D46F34DCCCB4C4B2D2ED10D4B2D2ACECCCFB35230D433E0E5722E4A4D2C494DD175AAB452F0CC2B2E49CCC909CEC84CCD49E1E5...<br \/>\nmy_array[&quot;13&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/setup_files\/53\/dfcSetup.jar<br \/>\nmy_array[&quot;13&quot;][&quot;file_size&quot;] = 53170976<br \/>\nmy_array[&quot;13&quot;][&quot;file_date&quot;] = 2005-03-23 15:29<br \/>\nmy_array[&quot;13&quot;][&quot;content_type&quot;] = Zip archive data, at least v2.0 to extract<br \/>\nmy_array[&quot;14&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;14&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;14&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;14&quot;][&quot;permissions&quot;] = rwxrwxr-x<br \/>\nmy_array[&quot;14&quot;][&quot;text&quot;] = &#039;####################################################################rn&#039;rn&#039; Name:rn&#039;rn&#039;   consistency_checker.ebsrn&#039;rn&#039; Description:rn&#039;rn&#039;   Docbase Consistency Checker Job and Utilitiesrn&#039;rn&#039;   NOTE:...<br \/>\nmy_array[&quot;14&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/setup_files\/53\/consistency_checker.ebs<br \/>\nmy_array[&quot;14&quot;][&quot;file_size&quot;] = 129883<br \/>\nmy_array[&quot;14&quot;][&quot;file_date&quot;] = 2005-03-20 08:29<br \/>\nmy_array[&quot;14&quot;][&quot;content_type&quot;] = ASCII text, with CRLF line terminators<br \/>\nmy_array[&quot;15&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;15&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;15&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;15&quot;][&quot;permissions&quot;] = rw-rw-r--<br \/>\nmy_array[&quot;15&quot;][&quot;text&quot;] = export DOCUMENTUM=\/home\/dmadmin\/documentum53nexport 
DM_HOME=$DOCUMENTUM\/product\/5.3nexport DOCUMENTUM_SHARED=$DOCUMENTUM\/sharednexport dfcpath=$DOCUMENTUM_SHARED\/dfcnn#export JAVA_HOME=$DOCUMENTUM_SHA...<br \/>\nmy_array[&quot;15&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/setup_files\/53\/init53<br \/>\nmy_array[&quot;15&quot;][&quot;file_size&quot;] = 803<br \/>\nmy_array[&quot;15&quot;][&quot;file_date&quot;] = 2017-11-28 09:44<br \/>\nmy_array[&quot;15&quot;][&quot;content_type&quot;] = ASCII text<br \/>\nmy_array[&quot;16&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;16&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;16&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;16&quot;][&quot;permissions&quot;] = rwxrwxr-x<br \/>\nmy_array[&quot;16&quot;][&quot;text&quot;] = 504B0304140008000800BB597732000000000000000000000000140000004D4554412D494E462F4D414E49464553542E4D46F34DCCCB4C4B2D2ED10D4B2D2ACECCCFB35230D433E0E5722E4A4D2C494DD175AAB452F0CC2B2E49CCC909CEC84CCD49E1E5...<br \/>\nmy_array[&quot;16&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/setup_files\/53\/server.jar<br \/>\nmy_array[&quot;16&quot;][&quot;file_size&quot;] = 174586519<br \/>\nmy_array[&quot;16&quot;][&quot;file_date&quot;] = 2005-03-23 20:16<br \/>\nmy_array[&quot;16&quot;][&quot;content_type&quot;] = Zip archive data, at least v2.0 to extract<br \/>\nmy_array[&quot;17&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;17&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;17&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;17&quot;][&quot;permissions&quot;] = rwxrwxr-x<br \/>\nmy_array[&quot;17&quot;][&quot;text&quot;] = 504B030414000800080031838231000000000000000000000000140000004D4554412D494E462F4D414E49464553542E4D46F34DCCCB4C4B2D2ED10D4B2D2ACECCCFB35230D433E0E5722E4A4D2C494DD175AAB452F0CC2B2E49CCC909CEC84CCD49E1E5...<br \/>\nmy_array[&quot;17&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/setup_files\/53\/tomcat4127Setup.jar<br 
\/>\nmy_array[&quot;17&quot;][&quot;file_size&quot;] = 9843412<br \/>\nmy_array[&quot;17&quot;][&quot;file_date&quot;] = 2004-12-03 01:25<br \/>\nmy_array[&quot;17&quot;][&quot;content_type&quot;] = Zip archive data, at least v2.0 to extract<br \/>\nmy_array[&quot;18&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;18&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;18&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;18&quot;][&quot;permissions&quot;] = rwxrwxr-x<br \/>\nmy_array[&quot;18&quot;][&quot;text&quot;] = 504B0304140008000800E3317732000000000000000000000000140000004D4554412D494E462F4D414E49464553542E4D46F34DCCCB4C4B2D2ED10D4B2D2ACECCCFB35230D433E0E5722E4A4D2C494DD175AAB452F0CC2B2E49CCC909CEC84CCD49E1E5...<br \/>\nmy_array[&quot;18&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/setup_files\/53\/bofcollaborationSetup.jar<br \/>\nmy_array[&quot;18&quot;][&quot;file_size&quot;] = 5587942<br \/>\nmy_array[&quot;18&quot;][&quot;file_date&quot;] = 2005-03-23 15:27<br \/>\nmy_array[&quot;18&quot;][&quot;content_type&quot;] = Zip archive data, at least v2.0 to extract<br \/>\nmy_array[&quot;19&quot;][&quot;owner_name&quot;] = dmadmin<br \/>\nmy_array[&quot;19&quot;][&quot;file_type&quot;] = -<br \/>\nmy_array[&quot;19&quot;][&quot;group_name&quot;] = dmadmin<br \/>\nmy_array[&quot;19&quot;][&quot;permissions&quot;] = rwxrwxr-x<br \/>\nmy_array[&quot;19&quot;][&quot;text&quot;] = 504B0304140008000800F5317732000000000000000000000000140000004D4554412D494E462F4D414E49464553542E4D46F34DCCCB4C4B2D2ED10D4B2D2ACECCCFB35230D433E0E5722E4A4D2C494DD175AAB452F0CC2B2E49CCC909CEC84CCD49E1E5...<br \/>\nmy_array[&quot;19&quot;][&quot;file_name&quot;] = \/home\/dmadmin\/setup_files\/53\/bofworkflowSetup.jar<br \/>\nmy_array[&quot;19&quot;][&quot;file_size&quot;] = 5900841<br \/>\nmy_array[&quot;19&quot;][&quot;file_date&quot;] = 2005-03-23 15:27<br \/>\nmy_array[&quot;19&quot;][&quot;content_type&quot;] = Zip archive data, at least v2.0 to 
extract<\/p>\n<p>file_type  permissions  owner_name  group_name  file_size  file_date         file_name                       content_type  text<br \/>\n---------  -----------  ----------  ----------  ---------  ----------------  ------------------------------  ------------  --------------------------------------------------<br \/>\n-          rw-rw----    dmadmin     dmadmin     2009       2017-11-17 18:16  \/home\/dmadmin\/oraInventory\/log  ASCII text    java.lang.NullPointerExceptionntat oracle.sysman<br \/>\n                                                                             s\/oraInstall2017-11-17_06-11-2                .oii.oiin.OiinNetOps.addNICInfo(OiinNetOps.java:14<br \/>\n                                                                             3PM.err                                       4)ntat oracle.sysman.oii.oiin.OiinNetOps.compute<br \/>\n                                                                                                                           NICList(OiinNetOps.java:109)ntat oracle.sysman.o<br \/>\n                                                                                                                           ii.oii...<br \/>\n-          rw-rw----    dmadmin     dmadmin     1309       2017-11-17 19:25  \/home\/dmadmin\/oraInventory\/log  ASCII text    You can find the log of this install session at:n<br \/>\n                                                                             s\/oraInstall2017-11-17_06-11-2                 \/home\/dmadmin\/oraInventory\/logs\/installActions201<br \/>\n                                                                             3PM.out                                       7-11-17_06-11-23PM.lognError in invoking target &#039;<br \/>\n                                                                                                                           install&#039; of makefile &#039;\/home\/dmadmin\/oracle\/databas<br \/>\n                                          
                                                                                 e\/...<br \/>\n-          rw-rw----    dmadmin     dmadmin     2973490    2017-11-17 19:29  \/home\/dmadmin\/oraInventory\/log  ASCII text,   INFO: Using paramFile: \/home\/dmadmin\/Downloads\/dat<br \/>\n                                                                             s\/installActions2017-11-17_06-  with very lo  abase\/install\/oraparam.ininINFO: nINFO: nINFO:<br \/>\n                                                                             11-23PM.log                     ng lines      Checking Temp space: must be greater than 500 MB.<br \/>\n                                                                                                                             Actual 75674 MB    PassednINFO: Checking swap s<br \/>\n                                                                                                                           pace...<br \/>\n-          rw-rw----    dmadmin     dmadmin     292        2017-11-17 19:26  \/home\/dmadmin\/oraInventory\/Con  XML 1.0 docu  n<!-- Copy\n                                                                             tentsXML\/libs.xml               ment, ASCII   right (c) 1999, 2017, Oracle and\/or its affiliates\n                                                                                                             text          .nAll rights reserved. -->n<!-- Do not modify th\n                                                                                                                           e contents of this file by hand. 
-->nn<br \/>\n                                                                                                                           &lt;QUER...<br \/>\n-          rw-rw----    dmadmin     dmadmin     468        2017-11-17 19:25  \/home\/dmadmin\/oraInventory\/Con  XML 1.0 docu  n<!-- Copy\n                                                                             tentsXML\/inventory.xml          ment, ASCII   right (c) 1999, 2017, Oracle and\/or its affiliates\n                                                                                                             text          .nAll rights reserved. -->n<!-- Do not modify th\n                                                                                                                           e contents of this file by hand. -->n<br \/>\n                                                                                                                           n&lt;VER...<br \/>\n-          rw-rw----    dmadmin     dmadmin     329        2017-11-17 19:26  \/home\/dmadmin\/oraInventory\/Con  XML 1.0 docu  n<!-- Copy\n                                                                             tentsXML\/comps.xml              ment, ASCII   right (c) 1999, 2017, Oracle and\/or its affiliates\n                                                                                                             text          .nAll rights reserved. -->n<!-- Do not modify th\n                                                                                                                           e contents of this file by hand. 
-->nn<br \/>\n                                                                                                                           &lt;TL_L...<br \/>\n-          rw-rw----    dmadmin     dmadmin     60         2017-11-17 19:26  \/home\/dmadmin\/oraInventory\/ora  ASCII text    inventory_loc=\/home\/dmadmin\/oraInventoryninst_gro<br \/>\n                                                                             Inst.loc                                      up=dmadminn<br \/>\n-          rwxrwx---    dmadmin     dmadmin     1661       2017-11-17 19:26  \/home\/dmadmin\/oraInventory\/ora  POSIX shell   #!\/bin\/shnAWK=\/bin\/awknCHMOD=\/bin\/chmodnCHGRP=\/<br \/>\n                                                                             instRoot.sh                     script, ASCI  bin\/chgrpnCP=\/bin\/cpnECHO=\/bin\/echonMKDIR=\/bin\/<br \/>\n                                                                                                             I text execu  mkdirnRUID=`\/usr\/bin\/id|$AWK -F( &#039;{print $1}&#039;|$A<br \/>\n                                                                                                             table         WK -F= &#039;{print $2}&#039;`nif [ ${RUID} != &quot;0&quot; ];then<br \/>\n                                                                                                                           n   $ECHO...<br \/>\n-          rw-rw----    dmadmin     dmadmin     362        2017-11-17 18:16  \/home\/dmadmin\/oraInventory\/oui  XML 1.0 docu  n<!-- Copy\n                                                                             \/srcs.lst                       ment, ASCII   right (c) 1999, 2017, Oracle and\/or its affiliates\n                                                                                                             text          .nAll rights reserved. 
-->n<!-- Do not modify th\n                                                                                                                           e contents of this file by hand. -->nn...<br \/>\n-          rw-------    dmadmin     dmadmin     4747       2018-09-01 13:59  \/home\/dmadmin\/.sqlite_history   assembler so  .helpn.helpn.exitn.help createn.tablesncreate<br \/>\n                                                                                                             urce, ASCII    table test(a1 number)ngon.tablesnselect * from<br \/>\n                                                                                                             text           testn\/nselect * from testngon.save test.dbn.<br \/>\n                                                                                                                           exitn.open stress-test.dbn.hn.healpn.helpn.sc<br \/>\n                                                                                                                           heman.schema %n.sc...<br \/>\n-          rw-rw-r--    dmadmin     dmadmin     176        2018-08-08 22:49  \/home\/dmadmin\/mp-fact.awk       ASCII text    echo 10000 | gawk -M '{facto = factorial($0); prin<br \/>\n                                                                                                                           tf(\"%dn%d digitsn\", facto, length(facto))} funct<br \/>\n                                                                                                                           ion factorial(n) {r = 1; for (i = 1; i &lt;= n; i++)<br \/>\n                                                                                                                           r *= i; return r}&#039; | lessn<br \/>\n-          rwxrwxr-x    dmadmin     dmadmin     24951775   2016-11-04 12:13  \/home\/dmadmin\/setup_files\/dfcU  Zip archive   504B03040A0000000000A95964490000000000000000000000<br \/>\n                                 
                                            nix.zip                         data, at lea  00040000006466632F504B03040A000000000071863D490000<br \/>\n                                                                                                             st v1.0 to e  000000000000000000000D0000006466632F6C6963656E7365<br \/>\n                                                                                                             xtract        732F504B03040A000000000071863D49000000000000000000<br \/>\n                                                                                                                           ...<br \/>\n-          rwxrwxr-x    dmadmin     dmadmin     2179169    2016-11-04 12:24  \/home\/dmadmin\/setup_files\/serv  Java archive  504B03040A0000000000785BD8420000000000000000000000<br \/>\n                                                                             iceWrapperManager.jar            data (JAR)   00090004004D4554412D494E462FFECA0000504B03040A0000<br \/>\n                                                                                                                           000800775BD8428241A6B65C00000066000000140000004D45<br \/>\n                                                                                                                           54412D494E462F4D414E49464553542E4D46F34DCCCB4C4B2D<br \/>\n                                                                                                                           ...<br \/>\n-          rwxrwxr-x    dmadmin     dmadmin     53170976   2005-03-23 15:29  \/home\/dmadmin\/setup_files\/53\/d  Zip archive   504B0304140008000800363377320000000000000000000000<br \/>\n                                                                             fcSetup.jar                     data, at lea  00140000004D4554412D494E462F4D414E49464553542E4D46<br \/>\n                                                                                                             st 
v2.0 to e  F34DCCCB4C4B2D2ED10D4B2D2ACECCCFB35230D433E0E5722E<br \/>\n                                                                                                             xtract        4A4D2C494DD175AAB452F0CC2B2E49CCC909CEC84CCD49E1E5<br \/>\n                                                                                                                           ...<br \/>\n-          rwxrwxr-x    dmadmin     dmadmin     129883     2005-03-20 08:29  \/home\/dmadmin\/setup_files\/53\/c  ASCII text,   &#039;#################################################<br \/>\n                                                                             onsistency_checker.ebs          with CRLF li  ###################rn&#039;rn&#039; Name:rn&#039;rn&#039;   co<br \/>\n                                                                                                             ne terminato  nsistency_checker.ebsrn&#039;rn&#039; Description:rn&#039;<br \/>\n                                                                                                             rs            rn&#039;   Docbase Consistency Checker Job and Utiliti<br \/>\n                                                                                                                           esrn&#039;rn&#039;   NOTE:...<br \/>\n-          rw-rw-r--    dmadmin     dmadmin     803        2017-11-28 09:44  \/home\/dmadmin\/setup_files\/53\/i  ASCII text    export DOCUMENTUM=\/home\/dmadmin\/documentum53nexpo<br \/>\n                                                                             nit53                                         rt DM_HOME=$DOCUMENTUM\/product\/5.3nexport DOCUMEN<br \/>\n                                                                                                                           TUM_SHARED=$DOCUMENTUM\/sharednexport dfcpath=$DOC<br \/>\n                                                                                                                           
UMENTUM_SHARED\/dfcnn#export JAVA_HOME=$DOCUMENTU<br \/>\n                                                                                                                           M_SHA...<br \/>\n-          rwxrwxr-x    dmadmin     dmadmin     174586519  2005-03-23 20:16  \/home\/dmadmin\/setup_files\/53\/s  Zip archive   504B0304140008000800BB5977320000000000000000000000<br \/>\n                                                                             erver.jar                       data, at lea  00140000004D4554412D494E462F4D414E49464553542E4D46<br \/>\n                                                                                                             st v2.0 to e  F34DCCCB4C4B2D2ED10D4B2D2ACECCCFB35230D433E0E5722E<br \/>\n                                                                                                             xtract        4A4D2C494DD175AAB452F0CC2B2E49CCC909CEC84CCD49E1E5<br \/>\n                                                                                                                           ...<br \/>\n-          rwxrwxr-x    dmadmin     dmadmin     9843412    2004-12-03 01:25  \/home\/dmadmin\/setup_files\/53\/t  Zip archive   504B0304140008000800318382310000000000000000000000<br \/>\n                                                                             omcat4127Setup.jar              data, at lea  00140000004D4554412D494E462F4D414E49464553542E4D46<br \/>\n                                                                                                             st v2.0 to e  F34DCCCB4C4B2D2ED10D4B2D2ACECCCFB35230D433E0E5722E<br \/>\n                                                                                                             xtract        4A4D2C494DD175AAB452F0CC2B2E49CCC909CEC84CCD49E1E5<br \/>\n                                                                                                                           ...<br \/>\n-          rwxrwxr-x    dmadmin     dmadmin     5587942    
2005-03-23 15:27  \/home\/dmadmin\/setup_files\/53\/b  Zip archive   504B0304140008000800E33177320000000000000000000000<br \/>\n                                                                             ofcollaborationSetup.jar        data, at lea  00140000004D4554412D494E462F4D414E49464553542E4D46<br \/>\n                                                                                                             st v2.0 to e  F34DCCCB4C4B2D2ED10D4B2D2ACECCCFB35230D433E0E5722E<br \/>\n                                                                                                             xtract        4A4D2C494DD175AAB452F0CC2B2E49CCC909CEC84CCD49E1E5<br \/>\n                                                                                                                           ...<br \/>\n-          rwxrwxr-x    dmadmin     dmadmin     5900841    2005-03-23 15:27  \/home\/dmadmin\/setup_files\/53\/b  Zip archive   504B0304140008000800F53177320000000000000000000000<br \/>\n                                                                             ofworkflowSetup.jar             data, at lea  00140000004D4554412D494E462F4D414E49464553542E4D46<br \/>\n                                                                                                             st v2.0 to e  F34DCCCB4C4B2D2ED10D4B2D2ACECCCFB35230D433E0E5722E<br \/>\n                                                                                                             xtract        4A4D2C494DD175AAB452F0CC2B2E49CCC909CEC84CCD49E1E5<br \/>\n                                                                                                                           ...<br \/>\n20 rows selected<br \/>\nreturn code =  0 after select<\/p>\n<p>return code= 0 after closing database\n<\/p>\n<p><\/code><br \/>\nIt can be used as follows:<br \/>\n<code><br \/>\nAWKLIBPATH=gawk-4.2.1\/extension\/.libs find <em>root_dir<\/em> -maxdepth <em>max_depth<\/em> -type f -printf \"%M %u %g %s %TY-%Tm-%Td 
%TH:%TM %p\\n\" | gawk -v limit=<em>how_many_files<\/em> -f stress_sqlite.awk<br \/>\n<\/code><br \/>\nExample:<br \/>\n<code><br \/>\nmax_depth=5; limit=50; AWKLIBPATH=gawk-4.2.1\/extension\/.libs find ~\/ -maxdepth $max_depth -type f -printf \"%M %u %g %s %TY-%Tm-%Td %TH:%TM %p\\n\" | gawk -v limit=$limit -f stress_sqlite.awk<br \/>\n<\/code><br \/>\nNotice the ugly nested replace() function calls in the SELECT statement:<br \/>\n<code><br \/>\n\"SELECT file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, CASE WHEN instr(content_type, 'ASCII text') &gt; 0 THEN \"<br \/>\n                             \"CASE WHEN length(content) &gt; 200 THEN replace(replace(replace(substr(content, 1, 200) || '...', '\\n', '\\\\n'), '\\t', '\\\\t'), '\\r', '\\\\r') ELSE replace(replace(replace(content, '\\n', '\\\\n'), '\\t', '\\\\t'), '\\r', '\\\\r') END ELSE substr(hex(content), 1, 200) || '...' END as text FROM stress\"<br \/>\n<\/code><br \/>\nThis is because there is no SQLite equivalent to the Unix command &#8220;tr set1 set2&#8221; or to Oracle&#8217;s translate() function, but such a function could easily be user-defined. 
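As a hint, here is a minimal sketch of the character-mapping core such a function would need, written as a plain gawk function for brevity (the name translate() and the test values are mine); a C version registered through sqlite3_create_function(), like readfile() and writefile() above, would implement the same loop in its callback:

```shell
# Sketch of a tr/translate()-style character mapper in gawk; a C user-defined
# SQL function registered via sqlite3_create_function() would do the same.
result=$(awk '
function translate(str, from, to,    i, c, p, out) {
   out = ""
   for (i = 1; i <= length(str); i++) {
      c = substr(str, i, 1)
      p = index(from, c)
      # characters present in "from" map to the same position in "to";
      # if "to" is shorter they are dropped; all other characters pass through
      out = out (p ? (p <= length(to) ? substr(to, p, 1) : "") : c)
   }
   return out
}
BEGIN { print translate("hello world", "lo", "01") }')
echo "$result"
```

Note that translate(), like Oracle's, maps single characters one-for-one, so multi-character replacements such as turning a newline into the two characters \n would still call for replace().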
Writing and registering such a function is left as an exercise for the reader.<br \/>\nThe column widths passed to the last sqlite_select() call came from a prior &#8220;draft&#8221; execution with the following call:<br \/>\n<code><br \/>\n   rc = sqlite_select(my_db, \"SELECT file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, CASE content_type WHEN 'ASCII text' THEN \"<br \/>\n                             \"CASE WHEN length(content) &gt; 200 THEN replace(replace(substr(content, 1, 200) || '...', '\\n', '\\\\n'), '\\t', '\\\\t') ELSE replace(replace(content, '\\n', '\\\\n'), '\\t', '\\\\t') END ELSE '' END as text FROM stress\")<br \/>\n<\/code><br \/>\nAnd the result was:<br \/>\n<code><br \/>\n...<br \/>\nOptimum column widths<br \/>\n=====================<br \/>\nfor query: SELECT file_type, permissions, owner_name, group_name, file_size, file_date, file_name, content_type, CASE content_type WHEN 'ASCII text' THEN CASE WHEN length(content) &gt; 200 THEN replace(replace(substr(content, 1, 200) || '...', '<br \/>\n', '\\n'), '\t', '\\t') ELSE replace(replace(content, '<br \/>\n', '\\n'), '\t', '\\t') END ELSE '' END as text FROM stress<br \/>\nfile_type          9<br \/>\npermissions        11<br \/>\nowner_name         10<br \/>\ngroup_name         10<br \/>\nfile_size          9<br \/>\nfile_date          16<br \/>\nfile_name          71<br \/>\ncontent_type       42<br \/>\ntext               209<br \/>\n<\/code><br \/>\nSo those values were injected back after some adjustments.<br \/>\nBeware that files can add up pretty quickly, hence the file limit parameter; ditto for the database file. 
Here, the 50 files in ~\/dmadmin sum up to almost 2 Gb already, so make sure there is enough free space for the database file.<br \/>\nIf the way sqlite_select() works does not appeal to you, despite all the different display formats it offers, you can give sqlitebrowser a try, <a title=\"sqlitebrowser\" href=\"http:\/\/sqlitebrowser.org\/\" target=\"_blank\" rel=\"noopener\">available here<\/a>.<\/p>\n<h3>Bulk updates from an array<\/h3>\n<p>With the sqlite_exec() primitive available, it is possible to build up a bulk insert, update or delete function that takes its database parameters from a gawk array. Let&#8217;s name it bulk_exec(). Here is a possible implementation:<\/p>\n<pre class=\"brush: bash; gutter: true; first-line: 1; highlight: [1]\">\nfunction bulk_exec(db, op, table_name, larray     , rowid, stmt, columns, values, col, rc) {\n   for (rowid in larray) {\n      if (\"delete\" == op) {\n         stmt = sprintf(\"DELETE FROM %s WHERE ROWID = %s\", table_name, rowid)\n      }\n      else if (\"update\" == op) {\n         columns = \"\"\n         values = \"\"\n         for (col in larray[rowid]) {\n            columns = columns (\"\" == columns ? \"\" : \", \") col\n            values = values (\"\" == values ? \"\" : \", \") \"'\" larray[rowid][col] \"'\"\n         }\n         stmt = sprintf(\"UPDATE %s SET (%s) = (%s) WHERE ROWID = %s\", table_name, columns, values, rowid)\n      }\n      else if (\"insert\" == op) {\n         columns = \"\"\n         values = \"\"\n         for (col in larray[rowid]) {\n            columns = columns (\"\" == columns ? \"\" : \", \") col\n            values = values (\"\" == values ? \"\" : \", \") \"'\" larray[rowid][col] \"'\"\n         }\n         stmt = sprintf(\"INSERT INTO %s (%s) VALUES (%s)\", table_name, columns, values)\n      }\n      rc = sqlite_exec(db, stmt)\n      print \"return code = \", rc\n   }\n}\n<\/pre>\n<p>The op test is performed here for each row, which is inefficient, but this is just an example.<br \/>\ndb is the database handle;<br \/>\nop is the operation to perform, one of &#8220;insert&#8221;, &#8220;update&#8221; or &#8220;delete&#8221;;<br \/>\ntable_name is the table to modify;<br \/>\nlarray is the array that contains the tabular data to apply to table_name.<br \/>\nThe structure of the array larray is important. For one thing, it is indexed by rowids so as to uniquely identify the rows to delete or update (obviously not used when inserting rows); this differs from the sequential number usually used as an index. The second dimension of the array must contain the column values for updating existing rows or inserting new rows (obviously not used when deleting rows). If the gawk array comes from a preceding call to sqlite_select(), it may be necessary to include the rowid column in the SELECT statement&#8217;s list of columns (see line 2 below) and remap the array to be indexed by that rowid (see lines 9 to 16 below). 
Here is a code snippet showing this for the 3 bulk operations:<\/p>\n<pre class=\"brush: bash; gutter: true; first-line: 1; highlight: [2,9,11,12,13,16,17,18]\">\n...\n   stmt = \"SELECT rowid, n1, s1, s2 FROM test1\";\n   rc = sqlite_select(my_db, stmt, 0, a_test)\n   delete del_test\n   delete mod_test\n   delete ins_test\n   for (i in a_test)\n      if (0 == i % 4)\n         del_test[a_test[i][\"rowid\"]] = 0\n      else if (1 == i % 4) {\n         mod_test[a_test[i][\"rowid\"]][\"n1\"] = a_test[i][\"n1\"] * 100\n         mod_test[a_test[i][\"rowid\"]][\"s1\"] = a_test[i][\"s1\"] \" - s1 modified\"\n         mod_test[a_test[i][\"rowid\"]][\"s2\"] = a_test[i][\"s2\"] \" - s2 modified\"\n      }\n      else if (2 == i % 4) {\n         ins_test[a_test[i][\"rowid\"]][\"n1\"] = int(1000 * rand()) + 1000\n         ins_test[a_test[i][\"rowid\"]][\"s1\"] = a_test[i][\"s1\"] \" - s1 inserted\"\n         ins_test[a_test[i][\"rowid\"]][\"s2\"] = a_test[i][\"s2\"] \" - s2 inserted\"\n      }\n   dumparray(\"a_test\", a_test)\n   dumparray(\"del_test\", del_test)\n   dumparray(\"mod_test\", mod_test)\n   dumparray(\"ins_test\", ins_test)\n   bulk_exec(my_db, \"update\", \"test1\", mod_test)\n   bulk_exec(my_db, \"delete\", \"test1\", del_test)\n   bulk_exec(my_db, \"insert\", \"test1\", ins_test)\n   dumparray(\"a_test\", a_test)\n...\n<\/pre>\n<p>An alternate implementation could still use the sequential integer index as returned by sqlite_select() and, if needed, extract the rowid column from the array. 
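One caveat with bulk_exec() as written: column values are wrapped in single quotes verbatim, so a value that itself contains a quote would break the generated statement. A small helper (a sketch; the name sql_quote() is mine) can pre-quote values the SQL way, by doubling embedded single quotes:

```shell
# Sketch: quote a value for use as a single-quoted SQL literal by
# doubling any embedded single quotes, as SQL requires.
result=$(awk '
function sql_quote(v,    q) {
   q = sprintf("%c", 39)   # the single-quote character
   gsub(q, q q, v)         # SQL escapes a quote by doubling it
   return q v q
}
BEGIN {
   q = sprintf("%c", 39)
   print sql_quote("O" q "Hara")
}')
echo "$result"
```

bulk_exec() could then concatenate sql_quote(larray[rowid][col]) instead of quoting by hand.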
Here is an illustration of this alternate implementation for the bulk delete:<br \/>\n<pre class=\"brush: bash; gutter: true; first-line: 1; highlight: []\">\nfunction bulk_exec(db, op, table_name, larray     , i, stmt, columns, values, col) {\n   for (i in larray) {\n      if (\"delete\" == op) {\n         stmt = sprintf(\"DELETE FROM %s WHERE ROWID = %s\", table_name, larray[i][\"rowid\"])\n      }\n...\n<\/pre>\n<p>bulk_exec() could have been implemented in the extension C code but the benefits would have been limited.<\/p>\n<h3>Conclusion<\/h3>\n<p>One important remark is that although SQLite supports UTF-8 and UTF-16, no particular attention has been paid here to C string operations; C strings are still char * in the extension, but it shouldn&#8217;t be too difficult to convert them to wchar_t * for multi-byte characters.<br \/>\nThe extension is quite heavy on blobs because I find them fascinating. However, blobs may be fine for storing that 4 Mb Despacito MP3 song, but not so much for that uncompressed 4 Gb Mamma Mia ! Editor&#8217;s Cuts Blu-ray Edition. The fact that it is possible to store anything in a database does not imply that it is sensible to do so. Even Documentum offers blob stores for faster content retrieval, but the content sizes are limited to 64 Kb. The contents are still stored in filesystems because there is no gain in dealing with huge (in the Tb or above range) database datafiles (although with data files on raw disks it may be a different story for repositories). It might be simpler to have everything in one place, metadata and contents, but performance, administration, and backup would be much more painful. A more reasonable use of blobs is to store small, frequently used files in-line and larger ones on disk, with their full path names saved as metadata in the repository. That&#8217;s what Documentum, and the iPod, do. This is the best alternative given how slow it is to scan a file tree on disk. 
Scanning inside the database is way faster than in a filesystem, especially the Documentum filestores due to their particular structure (4 levels deep with all the files in the last level). Anyway, the feature is here and can prove useful if used appropriately.<br \/>\nThis extension proves again how easily gawk lets itself be customized for a broader, almost unexpected usage. So far, we can use it to work with Documentum repositories (dmgawk), work with databases (SQLite and PostgreSQL), process xml (awk\/xml), etc. The fact that all this is free and open-source helps greatly; no wonder this development model has gained such tremendous momentum.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Welcome to part III of a three-part article on extending gawk with a SQLite binding. Part I is here. Part II is here. In this final part, I&#8217;ll explain succinctly the code that was listed in part II. This part is best read while having Part II opened in another browser window. sqlite_gawk.c Line 48: [&hellip;]<\/p>\n","protected":false},"author":40,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[525],"tags":[],"type_dbi":[],"class_list":["post-11630","post","type-post","status-publish","format-standard","hentry","category-enterprise-content-management"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v27.2 (Yoast SEO v27.2) - https:\/\/yoast.com\/product\/yoast-seo-premium-wordpress\/ -->\n<title>A SQLite extension for gawk (part III) - dbi Blog<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" 
content=\"A SQLite extension for gawk (part III)\" \/>\n<meta property=\"og:description\" content=\"Welcome to part III of a three-part article on extending gawk with a SQLite binding. Part I is here. Part II is here. In this final part, I&#8217;ll explain succinctly the code that was listed in part II. This part is best read while having Part II opened in another browser window. sqlite_gawk.c Line 48: [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/\" \/>\n<meta property=\"og:site_name\" content=\"dbi Blog\" \/>\n<meta property=\"article:published_time\" content=\"2018-09-07T08:38:44+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-10-24T07:26:21+00:00\" \/>\n<meta name=\"author\" content=\"Middleware Team\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Middleware Team\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"42 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/\"},\"author\":{\"name\":\"Middleware Team\",\"@id\":\"https:\/\/www.dbi-services.com\/blog\/#\/schema\/person\/8d8563acfc6e604cce6507f45bac0ea1\"},\"headline\":\"A SQLite extension for gawk (part III)\",\"datePublished\":\"2018-09-07T08:38:44+00:00\",\"dateModified\":\"2025-10-24T07:26:21+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/\"},\"wordCount\":3284,\"commentCount\":0,\"articleSection\":[\"Enterprise content management\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/\",\"url\":\"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/\",\"name\":\"A SQLite extension for gawk (part III) - dbi 
Blog\",\"isPartOf\":{\"@id\":\"https:\/\/www.dbi-services.com\/blog\/#website\"},\"datePublished\":\"2018-09-07T08:38:44+00:00\",\"dateModified\":\"2025-10-24T07:26:21+00:00\",\"author\":{\"@id\":\"https:\/\/www.dbi-services.com\/blog\/#\/schema\/person\/8d8563acfc6e604cce6507f45bac0ea1\"},\"breadcrumb\":{\"@id\":\"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Accueil\",\"item\":\"https:\/\/www.dbi-services.com\/blog\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"A SQLite extension for gawk (part III)\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.dbi-services.com\/blog\/#website\",\"url\":\"https:\/\/www.dbi-services.com\/blog\/\",\"name\":\"dbi Blog\",\"description\":\"\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.dbi-services.com\/blog\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/www.dbi-services.com\/blog\/#\/schema\/person\/8d8563acfc6e604cce6507f45bac0ea1\",\"name\":\"Middleware 
Team\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/secure.gravatar.com\/avatar\/ddcae7ba0f9d1a0e7ae707f0e689e4a9c95bb48ec49c8e6d9cc86d43f4121cb6?s=96&d=mm&r=g\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/ddcae7ba0f9d1a0e7ae707f0e689e4a9c95bb48ec49c8e6d9cc86d43f4121cb6?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/ddcae7ba0f9d1a0e7ae707f0e689e4a9c95bb48ec49c8e6d9cc86d43f4121cb6?s=96&d=mm&r=g\",\"caption\":\"Middleware Team\"},\"url\":\"https:\/\/www.dbi-services.com\/blog\/author\/middleware-team\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"A SQLite extension for gawk (part III) - dbi Blog","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/","og_locale":"en_US","og_type":"article","og_title":"A SQLite extension for gawk (part III)","og_description":"Welcome to part III of a three-part article on extending gawk with a SQLite binding. Part I is here. Part II is here. In this final part, I&#8217;ll explain succinctly the code that was listed in part II. This part is best read while having Part II opened in another browser window. sqlite_gawk.c Line 48: [&hellip;]","og_url":"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/","og_site_name":"dbi Blog","article_published_time":"2018-09-07T08:38:44+00:00","article_modified_time":"2025-10-24T07:26:21+00:00","author":"Middleware Team","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Middleware Team","Est. 
reading time":"42 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/#article","isPartOf":{"@id":"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/"},"author":{"name":"Middleware Team","@id":"https:\/\/www.dbi-services.com\/blog\/#\/schema\/person\/8d8563acfc6e604cce6507f45bac0ea1"},"headline":"A SQLite extension for gawk (part III)","datePublished":"2018-09-07T08:38:44+00:00","dateModified":"2025-10-24T07:26:21+00:00","mainEntityOfPage":{"@id":"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/"},"wordCount":3284,"commentCount":0,"articleSection":["Enterprise content management"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/","url":"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/","name":"A SQLite extension for gawk (part III) - dbi Blog","isPartOf":{"@id":"https:\/\/www.dbi-services.com\/blog\/#website"},"datePublished":"2018-09-07T08:38:44+00:00","dateModified":"2025-10-24T07:26:21+00:00","author":{"@id":"https:\/\/www.dbi-services.com\/blog\/#\/schema\/person\/8d8563acfc6e604cce6507f45bac0ea1"},"breadcrumb":{"@id":"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/www.dbi-services.com\/blog\/a-sqlite-extension-for-gawk-part-iii\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Accueil","item":"https:\/\/www.dbi-services.com\/blog\/"},{"@type":"ListItem","position":2,"name":"A 
SQLite extension for gawk (part III)"}]},{"@type":"WebSite","@id":"https:\/\/www.dbi-services.com\/blog\/#website","url":"https:\/\/www.dbi-services.com\/blog\/","name":"dbi Blog","description":"","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.dbi-services.com\/blog\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/www.dbi-services.com\/blog\/#\/schema\/person\/8d8563acfc6e604cce6507f45bac0ea1","name":"Middleware Team","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/ddcae7ba0f9d1a0e7ae707f0e689e4a9c95bb48ec49c8e6d9cc86d43f4121cb6?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/ddcae7ba0f9d1a0e7ae707f0e689e4a9c95bb48ec49c8e6d9cc86d43f4121cb6?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/ddcae7ba0f9d1a0e7ae707f0e689e4a9c95bb48ec49c8e6d9cc86d43f4121cb6?s=96&d=mm&r=g","caption":"Middleware 
Team"},"url":"https:\/\/www.dbi-services.com\/blog\/author\/middleware-team\/"}]}},"_links":{"self":[{"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/posts\/11630","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/users\/40"}],"replies":[{"embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/comments?post=11630"}],"version-history":[{"count":1,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/posts\/11630\/revisions"}],"predecessor-version":[{"id":41173,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/posts\/11630\/revisions\/41173"}],"wp:attachment":[{"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/media?parent=11630"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/categories?post=11630"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/tags?post=11630"},{"taxonomy":"type","embeddable":true,"href":"https:\/\/www.dbi-services.com\/blog\/wp-json\/wp\/v2\/type_dbi?post=11630"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}