dbi services Blog
Welcome to the dbi services Blog! This IT blog focuses on database, middleware, and OS technologies such as Oracle, Microsoft SQL Server & SharePoint, EMC Documentum, MySQL, PostgreSQL, Sybase, Unix/Linux, etc. The dbi services blog represents the view of our consultants, not necessarily that of dbi services. Feel free to comment on our blog postings.
Migrating to Oracle OEM Cloud Control 12.1.0.2.0
I recently upgraded from Enterprise Manager Cloud Control 12.1.0.1.0 to the most recent version 12.1.0.2.0. This document describes the migration procedure and the different problems encountered.
The first point to know when you are about to upgrade an existing Enterprise Manager Cloud Control 12c is that Oracle Installer will create a new Middleware home. Thus, you will need to have enough free disk space (around 7 GB) on your system to be able to upgrade correctly.
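A quick way to verify this before launching the installer is to check the free space on the target file system. This is a minimal sketch; the /u00/app/oracle path and the 7 GB threshold simply mirror the text above, so adjust them to your own layout:

```shell
#!/bin/sh
# Minimal free-space check before launching the installer.
# /u00/app/oracle and the 7 GB threshold mirror the text above; adjust as needed.
check_free_space() {
    dir=$1
    min_gb=$2
    # df -P prints the available space in 1K blocks in the fourth column
    avail_kb=$(df -P "$dir" 2>/dev/null | awk 'NR==2 {print $4}')
    avail_kb=${avail_kb:-0}
    [ "$avail_kb" -ge $((min_gb * 1024 * 1024)) ]
}

if check_free_space /u00/app/oracle 7; then
    echo "enough free space for the new Middleware home"
else
    echo "less than 7 GB free - the upgrade will not fit" >&2
fi
```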
The upgrade procedure is easy to run, and as Cloud Control 12.1.0.1.0 is already installed, you will not get any failed prerequisite checks.
I chose to upgrade an existing Enterprise Manager system using the one-system upgrade:
The next screen is asking for the new Middleware home:
After asking for the sys and sysman passwords, it is necessary to stop the Oracle Management Server with the command emctl stop oms - do not use the -all option.
You do not have to stop the repository database and the listener.
The only problem encountered during the upgrade was about the emkey which should be present in the upgraded environment.
To solve this problem you need to run the command:
oracle@server1:/u00/app/oracle/Middleware/oms/bin/ [oms12c] ./emctl config emkey -copy_to_repos_from_file -repos_host server1 -repos_port 1521 -repos_sid EMREP12C -repos_user sysman -repos_pwd manager -emkey_file /u00/app/oracle/Middleware/oms/sysman/config/emkey.ora
Oracle Enterprise Manager Cloud Control 12c Release 12.1.0.2.0
Copyright (c) 1996, 2012 Oracle Corporation. All rights reserved.
Enter Admin User's Password :
The EMKey has been copied to the Management Repository. This operation will cause the EMKey to become unsecure.
After the required operation has been completed, secure the EMKey by running "emctl config emkey -remove_from_repos".
Finally, the upgrade to Enterprise Manager Cloud Control 12.1.0.2.0 ran successfully in less than an hour.
The next step consists of upgrading the management agents. This operation is done from the Cloud Control 12c console by using the Setup menu, Manage Cloud Control, Upgrade Agents:
You have to select the different management agents you need to migrate to the 12.1.0.2.0 release. By using the Control key you can select all the remote agents and run the upgrade on all of them.
For the agent migration itself I experienced some problems. All the remote agents were installed on the same operating system (Linux x86_64), but for one of them the Oracle procedure ran in a recursive loop and created a lot of backup directories, which led to a full file system. I chose to uninstall the 12.1.0.1.0 agent and to reinstall it with the agentDeploy.sh procedure, but I could also have used the Cloud 12c console. More precisely, if your agent is in release 12.1.0.1 with the AGENT_HOME set to /u00/app/oracle/agent12c/core/12.1.0.1.0, the upgrade procedure will install into /u00/app/oracle/agent12c/core/12.1.0.2.0
The management agent on the OMS side is upgraded in the old Middleware home. For example, after the first installation of Oracle Enterprise Manager the management agent is installed under /u00/app/oracle/Middleware/agent by default. When you upgrade this management agent, it will be upgraded in the same directory. You cannot choose the new directory where it will be installed. If you forget about this particular case, think your migration is finished, and delete the old Middleware home, you will delete the management agent!!!
The only way I found to bypass this problem is to uninstall the management agent on the OMS side and to reinstall it with the Cloud 12c console or via the agentDeploy.sh script.
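For the reinstallation, a silent agent deployment looks roughly as follows. Every value below (image path, base directory, host name, port, password) is a placeholder, so check the parameters against the 12c agent deployment documentation for your exact release before running anything:

```shell
# Silent reinstallation of the management agent with agentDeploy.sh.
# All values below (paths, host, port, password) are placeholders.
/tmp/agentimage/agentDeploy.sh \
    AGENT_BASE_DIR=/u00/app/oracle/agent12c \
    OMS_HOST=server1.example.com \
    EM_UPLOAD_PORT=4889 \
    AGENT_REGISTRATION_PASSWORD=<password>
```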
If you want to deinstall the management agent via the Enterprise Manager Cloud Control 12c console, you have to delete the targets associated with this agent before deleting the agent itself. The deletion of the targets from the console does not always work well, but you can use emcli to delete the targets and the agent in the following way:
oracle@vmtestoraem12c:/u01/app/oracle/Middleware_R2/oms/bin/ [oms12c] emcli delete_target -name="vmtestoraem12c.it.dbi-services.com:3872" -type="oracle_emd"
Error: This agent is currently monitoring other targets. Please delete the monitored targets before deleting the agent or use delete_monitored_targets option to delete all the monitored targets
oracle@vmtestoraem12c:/u01/app/oracle/Middleware_R2/oms/bin/ [oms12c] emcli delete_target -name="vmtestoraem12c.it.dbi-services.com:3872" -type="oracle_emd" -delete_monitored_targets
Target "vmtestoraem12c.it.dbi-services.com:3872:oracle_emd" deleted successfully
My last recommendation is to install the management agent outside of the Middleware home (for example in /u00/app/oracle/agent12c) in case of a future upgrade …
Just a few days after the migration, I observed that the newly upgraded 12.1.0.2.0 agent was in pending state and no longer monitored its targets. Looking at the Agent Process Statistics, I noticed an increase in the number of open files since the migration:
After some research, I found the Metalink note 1472085.1, which describes this problem of the 12.1.0.2.0 agent intermittently stopping its monitoring. A significant message appears in the agent log files:
Agent has stopped monitoring, the following errors are reported: Too Many Open Files.
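To see whether an agent is drifting toward this limit, the open descriptors of its process can be counted from /proc. This is a minimal, Linux-only sketch; finding the agent PID with pgrep on the process name is my own assumption, not something the note prescribes:

```shell
#!/bin/sh
# Count the open file descriptors of a process via /proc (Linux only).
count_open_fds() {
    pid=$1
    ls "/proc/$pid/fd" 2>/dev/null | wc -l
}

# Demo on the current shell; for the agent use e.g.:
#   count_open_fds "$(pgrep -f emagent)"
echo "process $$ currently has $(count_open_fds $$) open file descriptors"
```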
The patch 6895422 is available on Metalink, but before downloading it you have to select the correct version of the patch: the patch release must match the “Oracle JDBC/OCI Instant Client” release present in the AGENT_HOME.
To get the current version, run the following command:
opatch lsinventory -details
An example of the opatch output:
[---]Oracle JDBC/OCI Instant Client 11.2.0.3.0[---]
In this last example, you need to download the patch 6895422 release 11.2.0.3.0 and install it with the classical opatch apply command after the agent has been stopped.
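Matching the releases can be scripted. This sketch assumes the inventory line looks like the example above and simply pulls the version number out of the opatch output:

```shell
#!/bin/sh
# Pull the "Oracle JDBC/OCI Instant Client" release out of opatch output.
# Assumes the inventory line looks like the example above.
extract_client_version() {
    grep 'Oracle JDBC/OCI Instant Client' |
        sed 's/.*Instant Client[[:space:]]*\([0-9.][0-9.]*\).*/\1/'
}

# Normally: $AGENT_HOME/OPatch/opatch lsinventory -details | extract_client_version
echo 'Oracle JDBC/OCI Instant Client 11.2.0.3.0' | extract_client_version
```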
This patch works correctly for the 12.1.0.2.0 agent release. After a few days the number of open files stabilized at around 200, with almost 90 targets monitored by the agent.
The upgrade of Enterprise Manager Cloud Control 12c to 12.1.0.2.0 is recommended, but you have to be particularly careful with the management agents and the bugs described above.