
Franck's Oracle blog

Blog of Franck Pachot, Consultant at dbi services

Franck Pachot

I'm a Senior Consultant and Oracle Technology Leader at dbi services (Switzerland).
Certified DBA (OCM 11g, OCP 12c, Performance Tuning Expert, Exadata Implementation), I cover all database areas: architecture, data modeling, database design, tuning, operations, and training.
My preferred areas are Oracle troubleshooting and performance tuning, especially when I manage to enable an efficient collaboration between the developers and the operational team.


As an Oracle ACE, I participate in the Oracle Community through forums, blogs, articles, and presentations. You can follow my activity on this blog (RSS) and my Twitter account: @FranckPachot



12c comes with more dynamic sampling, now called dynamic statistics and using the new Adaptive Dynamic Sampling algorithm. The goal is to have better estimations, and better estimations give better execution plans. However, this new approach will increase parse time because dynamic sampling kicks in more often, reads more blocks, and runs more queries.

It's probably not a problem for applications that are well designed: using bind variables to avoid too many parses, having good statistics (histograms where they make sense, extended statistics for correlated columns). The SQL Plan Directives are there to trigger dynamic sampling only where misestimates have been observed. An OLTP application should not parse often, and should not have huge misestimates. A reporting use case can spend more time on parsing, and the few seconds spent on dynamic sampling will probably benefit the execution time.

In addition to that, in order to lower the dynamic sampling overhead, Oracle 12c Adaptive Dynamic Sampling runs its queries with the /*+ result_cache(snapshot=3600) */ hint. The result is cached in the result cache and is not invalidated by dependencies. So even when the underlying table is updated, the dynamic sampling result remains valid in the cache for 3600 seconds. This is why doing more dynamic sampling is not a big overhead, provided that:

  • your result cache is sized accordingly. The default (0.25% of MEMORY_TARGET, or 0.5% of SGA_TARGET, or 1% of SHARED_POOL_SIZE) is probably too low to fit all the dynamic sampling results for frequently parsed statements.
  • your result cache is enabled, which means that you are in Enterprise Edition
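To check both points, you can look at the sizing parameter and at what is currently cached. Here is a sketch (the parameter and the V$RESULT_CACHE_OBJECTS view are standard; the 'DS_SVC' comment that tags Adaptive Dynamic Sampling queries, and the 64M value, are assumptions to verify on your version):

```sql
-- how the result cache is sized (0 means disabled)
show parameter result_cache_max_size

-- resize it if the dynamic sampling results don't fit (example value)
alter system set result_cache_max_size=64M scope=both;

-- cached results; ADS queries are tagged with a DS_SVC comment
select type, status, name
from v$result_cache_objects
where name like '%DS_SVC%';
```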
So the question of the day is: I want to know whether the RESULT_CACHE hint is simply ignored in Standard Edition, or whether there is a mechanism that allows it when it comes from Adaptive Dynamic Sampling.

If you have a bad application (not using bind variables, parsing at each execution) and you are in Standard Edition, there is a risk that the parse contention you currently suffer from (CPU and latches) will become more problematic (more CPU and I/O). Let's try the following:

declare
 c sys_refcursor;
begin
 for i in 1..100
 loop
  open c for 'select count(*) COUNT'||i||' from DEMO_TABLE where a+b=c+d';
  dbms_sql.return_result(c);
  null;
 end loop;
end;
/
which runs the same statement 100 times without bind variables. So I'm parsing it each time, but it reads the same table with the same predicate, so the result of dynamic sampling should not change much.

I'll run it in Standard and Enterprise editions, with no dynamic sampling, and with the new AUTO level.
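To switch between the two behaviors at session level, the optimizer_dynamic_sampling parameter can be used; in 12c, level 11 is the new AUTO level (a sketch, to be checked on your version):

```sql
-- baseline run: no dynamic sampling at all
alter session set optimizer_dynamic_sampling=0;

-- second run: the new 12c automatic level (Adaptive Dynamic Sampling)
alter session set optimizer_dynamic_sampling=11;
```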

Hits: 236
Rate this blog entry:
0
Continue reading 0 Comments

Today I've presented SQL Plan Directives at the SOUG Romandie event. I had a question about the import/export of directives with Data Pump. The idea is that a lot of testing has been done on QA in order to validate the upgrade to 12c. A few directives had bad consequences (see Ludovico Caldara's blog post for an example), so those directives have been disabled. When going to production, they want to start with those directives disabled. Yes, they can be imported: we have to pack them into a staging table, import that table, and unpack them. It's similar to what we do with SQL Plan Baselines.
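A minimal sketch of the pack/unpack round trip with DBMS_SPD follows. The staging table name SPD_STAGE and owner ADMIN are placeholders, and the parameter names should be checked against your version's documentation:

```sql
-- on QA: create a staging table and pack the directives into it
exec dbms_spd.create_stgtab_directive(table_name=>'SPD_STAGE', table_owner=>'ADMIN');

set serveroutput on
declare
 n number;
begin
 -- pack_stgtab_directive is a function returning the number of directives packed
 n := dbms_spd.pack_stgtab_directive(table_name=>'SPD_STAGE', table_owner=>'ADMIN');
 dbms_output.put_line(n||' directives packed');
end;
/

-- transport SPD_STAGE with Data Pump, then on production:
declare
 n number;
begin
 n := dbms_spd.unpack_stgtab_directive(table_name=>'SPD_STAGE', table_owner=>'ADMIN');
 dbms_output.put_line(n||' directives unpacked');
end;
/
```

Since the directives were packed with their DISABLED state, production starts with the same state that was validated on QA.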


If you have read Matching SQL Plan Directives and queries using it then you know how to use the '+metrics' format of dbms_xplan.
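As a reminder, that format option is passed like any other dbms_xplan format string (a sketch; '+metrics' is not part of the documented format options, so the output depends on your version):

```sql
-- show the last executed plan with the additional metrics section
select * from table(dbms_xplan.display_cursor(format=>'+metrics'));
```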

Tagged in: Oracle 12c Performance

This blog post is something I had in draft, and Laurent Schneider's blog post reminded me to publish it. With the right RMAN configuration you should not have to manage backup files yourself: the RMAN catalog knows them and RMAN should be able to access them. If you want to keep a backup for a long time, you just tell RMAN to keep it.
But sometimes RMAN is not connected to your tape backup software, or the backups are not shared across all sites, and you have to restore or copy the set of files needed for a restore database or a duplicate database.
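Two RMAN commands cover those cases; here is a sketch (the retention duration and tag are example values):

```
# list the backup sets a restore would need, without restoring anything
RMAN> restore database preview summary;

# keep a backup beyond the configured retention policy
RMAN> backup database keep until time 'sysdate+365' tag 'yearly';
```

The PREVIEW output gives you exactly the set of pieces to copy to the other site before running the restore or duplicate.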


If you followed the 1M rows insert variations, you have seen how something simple brought me into an investigation with the latest tools available to see where the time is spent. When the time is CPU time, there is no wait event to give the detail of DB time. I used perf events in the previous post, and I'll now visualize them with a CPU Flame Graph. My goal is to understand why my test case is 2x longer in 11.2.0.4 vs. 11.2.0.3.

The idea to use Flame Graphs comes from Luca Canali.
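For reference, the usual pipeline with Brendan Gregg's FlameGraph scripts looks like this (a sketch; the PID, duration, and script paths are placeholders):

```
# sample the call stacks of the server process for 60 seconds
perf record -g -p $PID sleep 60

# fold the stacks and render the SVG
perf script | ./stackcollapse-perf.pl | ./flamegraph.pl > dbtime.svg
```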


In the previous post I explained that I observed a 2x CPU usage when running my insert test case in 11.2.0.4 vs. 11.2.0.2. I discussed it with @martinberx @OracleSK @TanelPoder @xtner @Ycolin @fritshoogland @lleturgez @LucaCanaliDB on Twitter, and the discussion made things clear.


The Oracle In-Memory option is for reporting. Oracle has chosen a hybrid approach: maintain the In-Memory Column Store in addition to the buffer cache. Don't forget our In-Memory event, but for the moment, let's try the 1M row insert when the target table is IN MEMORY.
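Putting the target table in the column store takes one statement; a sketch (assumes INMEMORY_SIZE is set and the DEMO_TABLE name from the test case):

```sql
-- mark the table for population in the In-Memory Column Store
alter table DEMO_TABLE inmemory priority high;

-- check that the population is complete
select segment_name, populate_status, bytes_not_populated
from v$im_segments;
```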

Tagged in: In-memory Oracle 12c

This is another blog post I'm writing while reviewing the presentation I'm giving next week for SOUG and next month for DOAG. You have SQL Plan Directives used by several queries that have the same kind of predicates, and queries that can use several SPDs. Can you match them?
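A starting point is to join the directive views to see which objects and columns each directive covers (a sketch; the DEMO schema is a placeholder):

```sql
-- directives and the objects/columns they were created on
select d.directive_id, d.state, d.reason,
       o.owner, o.object_name, o.subobject_name
from dba_sql_plan_directives d
join dba_sql_plan_dir_objects o on o.directive_id = d.directive_id
where o.owner = 'DEMO'
order by d.directive_id;
```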


This year is the year of migration to 12c. Each Oracle version had its CBO feature that made it challenging. The most famous was bind variable peeking in 9iR2. Cardinality feedback in 11g also came with some surprises. 12c comes with SPD in any edition, accompanied by Adaptive Dynamic Sampling. If you want to know more about them, the next date is in Switzerland: http://www.soug.ch/events/sig-150521-agenda.html

Tagged in: Oracle 12c Performance

In the first variation, I did a bulk insert with the PL/SQL FORALL statement. A comment suggested adding the APPEND_VALUES hint that appeared in 11gR2. The APPEND_VALUES hint does the same as APPEND, but for inserts with a VALUES clause: it inserts in direct-path, directly into the datafiles, bypassing the buffer cache. Of course, this makes sense only with bulk inserts, because you don't want to do that for a single row.
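Combined with FORALL, it looks like this (a sketch; the single-column DEMO_TABLE(n) is an assumption standing in for the test-case table):

```sql
declare
 type t_num is table of number;
 l t_num := t_num();
begin
 l.extend(1000000);
 for i in 1..l.count loop l(i) := i; end loop;
 -- direct-path insert from the PL/SQL collection
 forall i in 1..l.count
  insert /*+ append_values */ into DEMO_TABLE(n) values (l(i));
 commit; -- required: the session cannot read the table after a direct-path insert until commit
end;
/
```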
