EC2 Instance Recovery: Fixing Block Device Issues via /etc/fstab and Rescue Instance

In this blog post, I will share my firsthand experience tackling and resolving a critical issue with an inaccessible and failed EC2 instance. I’ll provide a detailed account of the problem, its impact, and the step-by-step approach I took to address it. Additionally, I’ll share valuable insights and lessons learned to help prevent similar issues in the future.

EC2 Instance Recovery


An EC2 instance failed its Instance Status Checks and was inaccessible through SSM because the boot process dropped into emergency mode. After analyzing the OS boot log, it was identified that the issue stemmed from a mount point failure caused by a malformed or missing secondary block device entry. There are several steps you can take to troubleshoot and resolve an issue like this.

Benefits of EC2 Instance Recovery

  • Quick Diagnosis and Resolution
  • Effective Mitigation
  • Accurate Problem Localization
  • Minimal Downtime
  • Restoration of SSM (Systems Manager) Access

Here’s a general guide to help you identify and address the problem:

Step 1: Check Instance Status Checks

  • Go to the AWS Management Console.
  • Navigate to the EC2 dashboard and select “Instances.”
  • Identify the problematic instance and check the status checks.
  • There are two types: “System Status Checks” and “Instance Status Checks.”
  • Look for the specific error messages that may provide insights into the issue.
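
The same status information can be pulled with the AWS CLI. Below is a minimal sketch; the instance ID and region are placeholder values:

    # Show system and instance status checks for one instance.
    # --include-all-instances also reports instances that are stopped.
    aws ec2 describe-instance-status \
      --instance-ids i-0123456789abcdef0 \
      --region us-east-1 \
      --include-all-instances

In the output, a failed “reachability” detail under InstanceStatus (with SystemStatus passing) typically points to an OS-level boot problem like the emergency mode scenario described here.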


Step 2: Check System Logs

  • Review the system logs for the instance to gather more information on the underlying issue.
  • Select the instance and go to “Actions” –> “Monitor and troubleshoot” –> “Get system log” to view the logs.
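
The same boot log can be fetched from the CLI; a minimal sketch, where the instance ID is a placeholder and --latest is honored only on instance types that support it:

    # Fetch the instance's console output (the OS boot log).
    aws ec2 get-console-output \
      --instance-id i-0123456789abcdef0 \
      --latest \
      --output text

In this scenario, the log is where you would spot a failed mount followed by the system dropping into emergency mode.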


Step 3: Verify IAM Role Permissions

  • Ensure that the IAM role associated with the EC2 instance has the necessary permissions for SSM (System Manager).
  • The role should have the ‘AmazonSSMManagedInstanceCore’ policy attached.
  • If the mentioned policy is not attached, then you need to attach it (see the sketch below).
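
A quick CLI sketch for attaching the policy; the role name is a placeholder for whatever role your instance profile uses:

    # Attach the managed SSM policy to the instance's IAM role.
    aws iam attach-role-policy \
      --role-name MyEC2Role \
      --policy-arn arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore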


If the issue is related to a malformed device name in the /etc/fstab file, you can follow the steps below to correct it:

1. Launch a Rescue Instance

  • Launch a new EC2 instance in the same region as your problematic instance. This instance will be used to mount the root volume of the problematic instance.

2. Stop the Problematic Instance

  • Stop the problematic EC2 instance to detach its root volume.

3. Detach the Root Volume from the problematic Instance

  • Go to the AWS Management Console –> Navigate to the EC2 dashboard and select “Volumes.” –> Identify the root volume attached to the problematic instance and detach it.


4. Attach the Root Volume to the Rescue Instance

  • Attach the root volume of the problematic instance to the rescue instance. Make a note of the device name it gets attached to (e.g., /dev/xvdf).
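
Steps 2 through 4 can also be scripted with the AWS CLI; a minimal sketch, with placeholder instance and volume IDs:

    # Stop the problematic instance and wait until it is fully stopped.
    aws ec2 stop-instances --instance-ids i-0123456789abcdef0
    aws ec2 wait instance-stopped --instance-ids i-0123456789abcdef0

    # Detach its root volume and wait for it to become available.
    aws ec2 detach-volume --volume-id vol-0123456789abcdef0
    aws ec2 wait volume-available --volume-ids vol-0123456789abcdef0

    # Attach the volume to the rescue instance as a secondary device.
    aws ec2 attach-volume \
      --volume-id vol-0123456789abcdef0 \
      --instance-id i-0fedcba9876543210 \
      --device /dev/xvdf

Note that on Nitro-based instances the volume may surface inside the OS as an NVMe device (e.g., /dev/nvme1n1) even though it was requested as /dev/xvdf; run lsblk on the rescue instance to confirm the actual name before mounting.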


5. Access the Rescue Instance

  • Connect to the rescue instance using SSH or other methods.

Mount the Root Volume:

  • Create a directory to mount the root volume. For example: sudo mkdir /mnt/rescue
  • Mount the root volume on the rescue instance: sudo mount /dev/xvdf1 /mnt/rescue
  • Edit the /etc/fstab file: open it with a text editor such as nano or vim: sudo nano /mnt/rescue/etc/fstab

Locate the entry that corresponds to the secondary block device and correct the device name. Ensure that the device name matches the actual device name for the attached volume.
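
As an illustration, a corrected entry might look like the following. The UUIDs and mount point are placeholders; sudo blkid on the rescue instance will show the real values:

    # /mnt/rescue/etc/fstab (illustrative entries only)
    # Root filesystem, left unchanged:
    UUID=1111aaaa-2222-bbbb-3333-cccc4444dddd  /      xfs  defaults         0  0
    # Secondary volume with the corrected reference. "nofail" lets the
    # boot continue even if this volume is missing, instead of dropping
    # the instance into emergency mode:
    UUID=5555eeee-6666-ffff-7777-888899990000  /data  xfs  defaults,nofail  0  2

Referencing volumes by UUID rather than by device name also protects against device names shifting between reboots or instance types.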

Save and Exit:

  • Save the changes to the /etc/fstab file and exit the text editor.
  • Unmount the Root Volume: sudo umount /mnt/rescue
  • Detach the Root Volume from the Rescue Instance

6. Attach the Root Volume back to the Problematic Instance

  • Go back to the AWS Management Console.
  • Attach the root volume back to the problematic instance using the original device name.
  • Start the Problematic Instance: Start the problematic instance and monitor its status checks to ensure it comes online successfully.
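
The reattach-and-start sequence as a CLI sketch, with the same placeholder IDs:

    # Reattach the repaired volume as the root device. The device name
    # must match the instance's root device mapping, commonly /dev/xvda
    # or /dev/sda1.
    aws ec2 attach-volume \
      --volume-id vol-0123456789abcdef0 \
      --instance-id i-0123456789abcdef0 \
      --device /dev/xvda

    # Start the instance and wait for both status checks to pass.
    aws ec2 start-instances --instance-ids i-0123456789abcdef0
    aws ec2 wait instance-status-ok --instance-ids i-0123456789abcdef0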

This process involves correcting the /etc/fstab file on the root volume by mounting it on a rescue instance. Once corrected, you can reattach the volume to the original instance and start it to check if the issue is resolved. Always exercise caution when performing operations on production instances, and ensure that you have backups or snapshots before making changes.

Conclusion

Resolving EC2 instance status check failures involves a systematic approach to identify and address the underlying issue. Common causes include networking problems, operating system faults, insufficient resources, storage failures, and AMI or instance configuration errors. In this case, correcting a malformed /etc/fstab entry from a rescue instance was enough to bring the instance back online.

Perficient’s Own Journey to Oracle ERP and HCM Cloud

Be intentional. Be simple and intuitive. Focus on the future. Use a global mindset. These are the guiding principles in how Perficient’s Oracle ERP and HCM practice is transforming our own back office from Oracle E-Business Suite to Oracle Cloud.


In a recent webinar, Perficient’s Vice President of People, Andrea Lampert, Chief Financial Officer, Paul Martin, Vice President and Controller, Susan Adomite, and our Oracle ERP Directors, Matt Makowsky (Finance) and Holly Higgins-Smith (HCM) discussed our journey and how we’re modernizing core business functions with Oracle Cloud to scale and grow.

You can tune in to the full webinar here.

Laying the Foundation

About a year ago, when we began to evaluate a migration to Oracle Cloud, it didn’t start with a current state assessment around our Oracle E-Business Suite. The focal point was not just on retiring traditional maintenance, software, servers, hardware, and infrastructure support. It was very much focused on the transformational opportunities in this project.

Room for Improvement

We took a good hard look at the things that weren’t working and put them into three buckets: poor employee experience, poor business experience and ongoing EBS maintenance.

When looking at employee experience, the current on-premises solution has a user interface that is difficult to use and not intuitive. We found this was having a big impact on our large ERP community. Another pain point is that we have multiple integrated systems. Employees and users are going from one system to another, and the experience is not very consistent.

For the business experience, there are a lot of manual steps, especially as it pertains to monthly, quarterly, and annual reporting. We’re a fast-growing organization, both organically and through acquisition. So, our inability to scale our current processes and platforms is very painful and costly.

The third bucket is ongoing EBS maintenance, which brings costly updates, patches, and support. Upgrades with the on-premises solution take months, if not years, and cost a lot of money around Oracle E-Business Suite.

Divide and Conquer

Our roadmap is being rolled out in two phases.

Phase 1 was focused on streamlining the financial close process and replacing some of the current EBS functionalities. We’re utilizing Accounting Hub, which links back to EBS seamlessly, bringing our sub ledgers into a single source and eliminating the need to switch applications. This provides a quick path to integrating our potential acquisitions.

In just nine months, we were able to retire our EBS general ledger and financial reporting is now all in one system 100% in the cloud.

Oracle Cloud gives us the ability to streamline our close process, enabling us to provide boardroom-ready financial statements quicker than with EBS and other systems.
Susan Adomite, Vice President and Controller, Perficient

Phase 2 is centered around our people and improving the employee experience. The plan is to take our disparate systems such as time and expense, recruiting, onboarding and all other workforce management solutions and get everything connected in the cloud. Perficient’s 6,500+ employees will have a more intuitive interface with more mobile applications.

We believe our people are our greatest asset and empowering them to do their best work is our top priority as we roll out this second phase.

Work Smarter, Not Harder

With Oracle Cloud, we’ve reduced the reliance on manual reconciliation and spreadsheets, and processes are now completely automated.

Moving to the cloud removes our dependency on managing our own hardware, networking and infrastructure.
Paul Martin, CFO, Perficient

Oracle keeps our applications current and modern and our business up to date with the latest improvements on a quarterly basis. We don’t have to wait years between upgrades, and with that comes additional features and functionalities that we can deploy as part of our solutions.

We’re excited to share this transformation journey. Are you ready to start yours?

Articulated & Optimized EBS Upgrade Strategy – Why, What & How?

With the end of Premier Support for EBS R12.1.3 approaching quickly, R12.2 upgrades are in extremely high demand. I thought I would share a few things I’ve learned based on my experience. Many DBAs in my circles are having a very hard time trying to upgrade both 19c and R12.2 within a single downtime window, due to the complexity introduced by the multitenant database, the R12.2 dual file system, and their effects on the applications. Let’s dive in further and walk through what is needed for an effective and seamless upgrade.

Image Courtesy – Release Schedule of Current Database Releases (Doc ID 742060.1)

Why the sudden surge to EBS database and applications upgrades?

Database 12.1.0.2 Extended Support has a fee waiver until July 2022, and EBS R12.1 Premier Support is provided through December 2021. With these dates fast approaching, customers are expediting their journeys to the 19c database and EBS R12.2. See the following documents for more details on the current support expiration dates.

Extended Support License Uplift Fee Waiver for Oracle Database 12.1 and 11.2 for Oracle E-Business Suite (Doc ID 2522948.1) – This document explains the fee waiver on the 12.1 database until July 2022.

ANNOUNCEMENTS: E-Business Suite 12.1 Premier Support Now Through Dec. 2021 and 11.5.10 Sustaining Support Exception Updates (Doc ID 1495337.1)

Whether to perform an on-prem upgrade is no doubt a tough decision, with Oracle’s Cloud ERP offering being another option. This blog is not about the on-prem vs. Cloud ERP discussion; it is focused on best practices for a seamless 19c/EBS 12.2 upgrade that apply to systems hosted both on-prem and within OCI (Oracle Cloud Infrastructure). Without question, combined 19c/EBS R12.2 upgrades are far more complex, but the right planning makes them manageable. Here are a few thoughts, based on my experience, on how to consider things proactively and plan up front, and how that can help produce a smooth and seamless upgrade.

What is the best approach for patching?

We all know these upgrades are mostly about patching, which starts with interoperability patches, PSUs, technology stack updates, and update packs, and ends with many standalone patches for specific bug fixes. Your patching strategy is key to driving the upgrade effectively and reducing the downtime window. Thorough patch analysis, along with appropriate parameter and option usage, can make a significant difference in upgrade time, and a lot of crafting and drafting is needed to optimize this.

Many times DBAs choose the “-1” (one behind the latest) version of patches such as PSUs or the technology stack. But given the complexity and the number of upgrade iterations involved, the project duration can easily run from a single quarter to over a year. Whether you choose the latest and greatest available patch versions therefore matters; otherwise you could effectively be installing different patches in each upgrade iteration. For stability, try not to add the latest versions of patches each time, and lock the patching document for these patch types after a couple of iterations.

Of course, this rule has to be flexible. Circumstances like a bug or performance issue might require a newly released version of a patch, but those should undergo approval and thorough testing before they are included in the upgrade. Adding these kinds of patches right before go-live is a big no-no, and the earlier you apply that rule in the project, the better. You can always bring an instance up to date on patch versions after go-live, as part of support best practices and regular maintenance.
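
For context, all of this patch analysis ultimately feeds the R12.2 online patching cycle driven by the adop utility. A minimal sketch of one cycle follows; the patch number is a placeholder:

    # One EBS R12.2 online patching cycle, run as the applications user.
    adop phase=prepare                  # sync and prepare the patch file system
    adop phase=apply patches=12345678   # apply the patch to the patch edition
    adop phase=finalize                 # ready the system for cutover
    adop phase=cutover                  # swap run and patch editions (brief outage)
    adop phase=cleanup                  # drop obsolete editioned objects

Because prepare, apply, and finalize can run while users stay online, a disciplined, locked-down patch list translates directly into fewer surprises at cutover, the one step that takes the system away from users.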

What is needed for a seamless approach?

This is a complex effort and here are a few things you should consider to help you drive the complete effort seamlessly.

  1. Take it very seriously. Many IT teams may not realize the full impact of the upgrade project, because the version numbers suggest a minor point release (e.g., R12.1 to R12.2) when these are in fact major upgrades. These teams should strongly consider formal project management due to the massive stakeholder involvement in these upgrades.
  2. Document every detail and solidify the implementation document with the applicable time taken for each step. Also maintain an issues log through all testing performed across the teams. Lock the document after a couple of iterations to avoid any surprises during go-live.
  3. Perform as many iterations as possible to reduce the go-live downtime window, and to improve repeatability.
  4. Split the activity in proportion to the complexity of your environment’s topology, and when necessary use the weekends following go-live for non-critical business configurations.
  5. Make sure to thoroughly follow the document below for Oracle best practices to effectively reduce the upgrade downtime.

Best Practices for Minimizing Oracle E-Business Suite Release 12.2.n Upgrade Downtime (Doc ID 1581549.1)


How to organize any additional configuration pieces along with upgrade?

If the business downtime granted is just a weekend, and there is much to deliver in addition to the database and EBS upgrade, you should settle on a well-planned and approved approach considerably ahead of go-live. Many times the upgrade team is expected to deliver a number of additional modifications within the single upgrade window on top of the database and EBS upgrades, such as SSO, TLS, TDE, JWS, DMZ (iReceivables, iSupplier, iExpense, etc.), and other Oracle product configurations like Endeca, ECC, and CCG. Try to add to the go-live window only the products that are critical to business operations, and handle the rest post go-live. To achieve this, the plan must be thoroughly designed in advance to determine exactly how much can be accomplished in the approved downtime window. Always discuss with your customer the option of splitting the upgrade into a few separate chunks as a workable strategy to meet the business needs and limit downtime. This entire activity is very dynamic, and it can be accomplished very smoothly with the right experts involved.

There are many more approaches to streamlining or reducing the downtime while still achieving an effective and efficient upgrade. Please reach out to us with any questions on EBS database and application upgrades. Our panel of Perficient experts is always here to serve and support your business.

NGL Webinar Q&A: Remote Delivery: From EBS and Hyperion to Oracle Cloud

Below are some key questions asked and answered during the Q&A portion of Remote Delivery: From EBS and Hyperion to Oracle Cloud, a recent webinar telling an on-premises to cloud story. To view the entire webinar, click here.

Why did NGL select Oracle’s ERP Cloud product, as opposed to upgrading E-Business Suite or moving EBS into the cloud as a hosted service? Did NGL have other options that they looked at?

[NGL Energy Partners] We did look at other options. Obviously, we were on a very dated version of EBS. We also evaluated the Microsoft AX platform we were on, and we looked at other large systems, like SAP. We wanted to get as out of the box as possible; we wanted to have best practices across our businesses, and we didn’t want to give people an excuse or reason to deviate from those best practices. So we thought going to the cloud, where you’re limited in the amount of customization when it comes to things like processes, would be beneficial to us. Again, we wanted to limit the amount of customization that we would have, but we want to stay on the cutting edge.

We consulted with our IT group on what their preference was, whether they wanted on-premises versus cloud. You obviously have to weigh the economics of that decision as well. And at the end of the day, this was the best fit for NGL. Everybody has to make those decisions themselves. We had been using Oracle previously, so that was a benefit in choosing Oracle and going into the cloud. Those were some of the factors that weighed in our decision.

I think it is unique for each company. Everybody has to look at their own situation and sometimes you want to total change and move away from the platform you may be on, or you want to stick with some consistency. Those are things that are that are unique to everybody.

Can you provide some examples of efficiencies that helped to streamline the financial close cycle?

[NGL Energy Partners] What we’re seeing at this point in time is that by having all of our data in one location, we can generate reports much quicker than we’ve been able to historically. One of the things that really drives your financial close is being able to close your revenue and your expenditure processes. Those are still, from a business perspective, managed in separate systems. For example, we use Rightangle for our liquids business, and we use a custom-built system in our water business for volumes. You still have to get those systems closed, which does take several days, generally after the end of a month. You’re not going to see any benefit there, or we have not yet, from an Oracle Cloud perspective, but getting all that data into one place has allowed us to accelerate some of our reporting.

That being said, we’re four months in and there’s a big learning curve in the new system, so each month we’re getting better.

I think we will ultimately be able to shave a day or two off of our close process, so to speak, to close the books. Having real-time reporting is going to be a significant benefit where we don’t have to generate reports on just a monthly basis. We can get some real-time information out of the system. And then, being able to essentially close the reporting process for each month on a much quicker basis, will help expedite and give us more time to analyze.

A lot of times as accountants we spend so much time accumulating the information that we don’t really get a chance to step back and look at it and say, “Does this all make sense?” That’s where I think we’ll get the major benefit.

How do you handle enhancements and customization changes? Does Oracle Cloud provide a development environment for migrating those changes to production?

[Perficient] A lot of people say or think there are just no customizations in the cloud. I would say in the traditional sense, there aren’t “customizations” in the cloud – you really can’t touch the code or the software directly. But there’s what’s called platform as a service (PaaS), where you can build custom applications of your own and then integrate them through what are called web services – REST APIs or SOAP web services – that insert data into the environment and pull data out. So that’s how you might integrate a custom application, or you can integrate an existing legacy application, such as NGL is doing with a few of their operational systems where customer bills are being generated and then being imported into the cloud.

For any type of form enhancement, there are ways to do personalizations in the traditional sense that we used to do in EBS, and most of the things we have found, you can do. This is one of many cloud projects we’ve delivered, and we have yet to meet a challenge that we honestly just could not fulfill.

[NGL Energy Partners] Yes, we have a dev, a test, and a production environment, and Matt can probably speak better to how they divvy those out and whether that comes along with a number of user licenses, but we have plenty of environments to do our development efforts.

[Perficient] Thanks for reminding me, I forgot to mention the test platforms that Oracle does provide. Yes, when you subscribe to the product, Oracle gives you the production and test environments. And then optionally, you can buy as many development environments as you want. Usually we find you shouldn’t need more than two, or three at the max.

Going back to the technical question, the code does not sit resident in any one of those environments; it sits in the integration layer, whether that’s MuleSoft, or Dell Boomi, or an integration platform of our own that’s based on Python-type code. The code really exists there. It is just a matter of where you point it.

 

Creating an EBS Target application in Data Management from a SQL query

When integrating an on-premise system like an E-Business Suite application with an Oracle Enterprise Performance Management Cloud application, SQL queries can be used as a means to extract the data from the source database. You can extract the data and then load it directly to the EPM Cloud applications using the EPM Integration Agent. The EPM Integration Agent executes a query against an on-premise relational database and then loads the data to the EPM Cloud application.

For an on-premise EBS system, a SQL query connecting the various database tables can be written to output data in a format ready to be loaded into the EPM applications. For this purpose, a corresponding ‘Target application’ must be created in Oracle Data Management to host the data from the SQL query, and load the data to the desired target EPM application.

The Target Application in Data Management for an on-premise EBS system can be created by choosing the source system as ‘EBS GL Balance’. But when using a SQL Query to extract data from the source, using the Query output as the ‘Data Source’ to build the Target Application yields the best results. This way, the Target Application dimensions are created exactly as the corresponding columns exist in the SQL Query and there is no scope for a mismatch between the Query and the columns in Data Management.

Using this SQL based integration, the system executes the required Query on the source data in the on-premise relational database, offloads processing, extracts and transforms the data at the source level, and then loads the data directly to EPM Cloud. This way, the EPM Cloud database is bypassed for staging and processing, eliminating any performance bottlenecks and improving the performance and scalability of the load process.

In this article, the detailed steps required to integrate SQL based EBS data with an EPM Application are discussed.

This article assumes that your EPM Application cubes and dimensions are already configured and set up, you have a working SQL query that extracts all the dimensions and corresponding data from the Oracle database, an EPM Integrations Agent is installed, and that you have some familiarity with Oracle Data Management integration tool.

Below is a step-by-step process to integrate an on-premise EBS system with an EPM Cloud application.

Create Queries in EPM Data Exchange

The first step for this integration is to create a Query for the data pull from Oracle database. Navigate to Data Exchange in the Application tab in the EPM Cloud application.


Click on Actions and select Query.


Add a new Query.


Provide a Query Name and add the SQL Query used to pull the required data in the Query String. Save the Query. The query name is also used on the Application Filter page in Data Management to identify the data extract query when registering the SQL data source in the target application.


Query is saved.


Export the Query output into a CSV file

Open a database management toolset like SQL Developer and run the query that was saved in Data Exchange.
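
If you prefer the command line, the same extract can be tested and spooled to CSV with SQL*Plus before the query goes into Data Exchange. The query below is a simplified sketch against the standard EBS GL tables (GL_BALANCES and GL_CODE_COMBINATIONS); your real query, joins, and filters will differ, and SET MARKUP CSV requires SQL*Plus 12.2 or later:

    # Run a simplified GL balance extract and spool it to a CSV file.
    # ebs_read/<password>@EBSDB is a placeholder connect string.
    sqlplus -s ebs_read/<password>@EBSDB <<'EOF'
    SET FEEDBACK OFF
    SET MARKUP CSV ON QUOTE OFF
    SPOOL gl_extract.csv
    SELECT cc.segment1, cc.segment2, cc.segment3,
           b.period_name, b.currency_code,
           NVL(b.period_net_dr,0) - NVL(b.period_net_cr,0) AS amount
    FROM   gl_balances b
    JOIN   gl_code_combinations cc
           ON cc.code_combination_id = b.code_combination_id
    WHERE  b.actual_flag = 'A'
    AND    b.period_name = 'Jan-21';
    SPOOL OFF
    EOF

Keeping the header row in the CSV matters here, since the column names become the Target Application dimensions when the file is uploaded later in this process.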


Right click on the Query Result and click Export.


Select the format to be CSV.


Select Left and Right Enclosure to be none.


Click on Browse and choose a destination for the Export.


Click Next to see the Export Summary.


Click Finish.


Query output is saved.


Create an EBS Target Application with Query output

Open Oracle Data Management in the Application tab in the EPM Cloud application.


On the Setup tab, click on Target Application.


Click Add to create a new Target Application for the EBS Query.


Select Data Source.


Choose Source System as On Premise Database.


In the Data Management directories, select a folder, browse and upload the CSV extract of the SQL query previously created.


Click Ok. Provide an optional Prefix to the Target Application name.


Click Ok again to import the SQL Query output columns as Target Application Dimensions.


Click Save to create the Target Application.


Update Target Application Filters

In the Target Application Details section, click on the Application Filters tab.


Enter the following for the filters:

  • Data Extract Query: Name of the Query created in Data Exchange
  • Delimiter: ‘,’ (Comma)
  • Credential Store: Cloud
  • JDBC Driver: Oracle
  • JDBC URL: Oracle EBS database JDBC connection (see the example below)
  • Username: Database user with access to tables and schemas used in the Query
  • Password: Password for the Database User
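
As an illustration, the JDBC URL for an EBS database typically follows the Oracle thin-driver format; the host, port, and service name below are placeholders:

    jdbc:oracle:thin:@//ebs-db.example.com:1521/EBSDB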


Save the updates to the Target Application.


Finish the Integration Setup and Import data

Create an ‘Import format’ using the Target Application created for the EBS Query as the Source, and the EPM Application as the Target.


Create the ‘Location’ for the Import format.


Create Global and Application ‘Period Mapping’ for the Target EPM Application.


Create ‘Data Load Rules’ and ‘Data Load Mappings’ for the ‘Location’.


Run the Data Load Rule to extract data from EBS and Import into the required EPM Application.


Review the ‘Data Load Workbench’ to validate that the SQL Query output matches the data in the Workbench.


You are all set to import data from EBS into the EPM application!


Introducing Our Partnership with Kyriba

We are excited to announce our partnership with Kyriba and a dedicated practice to provide our Oracle clients with a robust cloud-based treasury management solution. Specializing in cloud-based business transformation, Perficient’s new Treasury Technology practice will improve clients’ global banking footprints through bank connectivity and address concerns about payment fraud.

Named a leader by IDC for treasury and finance, Kyriba optimizes cash and risk management, payments and working capital strategies through a highly secure Software-as-a-Service platform.

Bank connectivity is one of the most complicated aspects of an ERP project. The top reasons are as follows:


  • Globally, there is no standard format utilized by banks. Many countries, and even banks within those countries, have their own distinctive file layouts, so each payment file and bank statement file may require custom development and testing.
  • Payment fraud is now a high-priority concern.
  • The parts of ERP projects that depend on bank connectivity and communication are subject to the timing constraints of the bank to support integrations and testing.
  • Businesses need the flexibility to define new banking relationships without large, inflated IT projects.
  • Current SWIFT compliance standards require a company to maintain internal SWIFT domain proficiency, requiring annual review of documentation and re-certifications.
  • Building bank connectivity without the Kyriba solution requires costly and risky custom development.

“Perficient’s Treasury Technology practice is part of its Oracle National Business Unit as we believe the solution is synergistic with ERP implementations especially cloud migrations that provide business transformational opportunities for our clients,” said Stuart Massey, General Manager at Perficient.  “With more than 2,000 clients worldwide and 65,000 total users, we believe Kyriba will continue to revolutionize the treasury management space.”

Is It Time to Upgrade Oracle?

Current Support Policies

If you are on Oracle EBS 12.1 or database 12c or below, you might want to consider upgrading. Oracle support policies change, so be sure to check the latest updates, but at the time of this writing, premier support for EBS 12.1 ends in December 2021. Currently, Oracle does not plan to offer extended support for EBS 12.1. However, if you upgrade to EBS 12.2, premier support should extend through December 2031. Effectively, that is a ten-year support extension.

On the database side, if you are on 11.2.0.4 or earlier, all forms of support will cease at the end of 2020. Oracle de-supported version 12.1.0.1 in August of 2016. Version 12.1.0.2 has been in extended support since August 1, 2019 and that will continue until July 31, 2022. If you are using it for EBS 12, you do not have to pay for the extended support, but everyone else does. Oracle does support version 12.2.0.1 for error correction and patching until November 2020, but also has no plans to provide extended support for it.  The current recommended release is 19c. It is fully supported through March 2023, with extended support going through March 2026.

What to Do

If you are on EBS 12.1, or are running an Oracle database before 19c that is not covered by the 12.1.0.2 waiver, and you want standard support, you will need to upgrade soon. Depending on your size, an Oracle major release upgrade can take a long time to plan and execute. The 12.2 upgrade includes some major changes to the underlying EBS architecture that can increase the complexity and time required. EBS and database upgrades can be done together or independently. You might also want to use this opportunity to move off your own older hardware and do a lift-and-shift to Oracle Cloud Infrastructure. Or maybe it’s time to consider moving to Oracle Cloud ERP. All these options are available, and Perficient is here to help you determine and execute the one that is best for your organization.

For more details, see the Oracle Lifetime Support Policy for Oracle Applications, and the Release Schedule of Current Database Releases (Doc ID 742060.1)

Oracle EBS Direct Connect Configuration in OneStream

As mentioned in part one, the four types of data integrations are delimited file, fixed file, data management, and data connector. In this blog, I will explain how to set up a direct connect that will pull data into OneStream from an external data source without the need for a file.

This article assumes that your cube and dimensions are already configured and set up in OneStream, your external data source is Oracle EBS, you have familiarity with business rules (VB.NET), and you have read part one as we will be going through a similar process to set up our EBS direct connect.

The high-level steps we need to perform are:

  1. Save database connection string on the OneStream Application Server.
  2. Restart IIS
  3. Test connection
  4. Create Connector business rule
  5. Create Data Source
  6. Create Transformation Rules
  7. Create Workflow Profile
  8. Load Data

OneStream XF Server Configuration

Navigate to the OneStream XF Server Configuration application on your on-premise instance (or work with OneStream Cloud Support to effect these changes on your Azure instance).

Select File > New Application Server Configuration File.

In the Application Server Configuration Settings, locate File Share Root Folder. Here you will add the path of your OneStreamShare\FileShare folder.

Navigate to the Databases section.
Select the three ellipses on the far right of Database Server Connections (Collection).

Add a new member. In the General settings, provide the member a name and select True for Is External Database. Once the name has been entered, you will see it change under Members.

Next, select the Database Provider Type that applies to your application.

Scroll to the Connection String setting and enter the connection string you will be using to connect to the EBS database.

Connection string formats vary depending on the server; your connection string should be similar in structure to the example below.
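
For example, a typical Oracle connection string has the following shape; the host, service name, and credentials are placeholders to be replaced with your own:

    Data Source=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=ebs-db.example.com)(PORT=1521))(CONNECT_DATA=(SERVICE_NAME=EBSDB)));User Id=onestream_read;Password=<password>;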

Save the .XML file to the OneStreamShare\FileShare location.

 

IIS (Internet Information Services Manager)

Once the .XML file has been saved, the server needs to be restarted for your changes to take effect in OneStream.

Click the appropriate server under connections on the left side.

Click Restart under Manage Server on the right side.


Testing Your Connection

Now that the connection has been set up, it’s helpful to test the connection and see if it works from within OneStream.

Login to the OneStream application.

Navigate to the Application tab > Dashboards.

Create a new Dashboard Maintenance Unit.

Provide your Dashboard Unit with a Name.

Select the newly created Dashboard Unit and click Create a Data Adapter.

Enter the following information:

Note: The external database connection will be the name that was given in the App Server Config File.

Next to SQL Query, click on the ellipses. Here, you can enter a sample query to test whether your connection is working. (Check with your database administrator if you need a sample query to run against your database.)

Click the Test Data Adapter button to run the query and display results.

Business Rule

Now that our connection is working, we can create a business rule that’ll pull data from EBS into your OneStream application.

Navigate to Application > Tools > Business Rules

Click Create Business Rule.

Enter the following information:

The integration type we will be using is a Connector. Connectors are used for direct integration between the source system (EBS) and OneStream.

The business rule must contain the connection member we created and queries that are formatted to pull data from EBS.

There are three main functions in this business rule:

  1. GetConnectionString() – returns the connection string
  2. GetFieldListSQL() – returns the field list to map to dimensions
  3. GetDataSQL() – returns the data records

Enter the rule and hit Save.
Compile the business rule to check for syntax errors.

Data Source

Click Create New Data Source.

Enter the required information for your application. I am loading my data into the Consol cube, Actual scenario type.

Navigate to Connector Settings and specify the connector name of the business rule that was just created.

Once you click save, it will populate fields on the main window.

Apply selections for dimensions. This process is similar to part one; however, the data source pulls and displays fields from the query created in the business rule. You will be applying selections to the populated areas. If there are dimensions that don’t have corresponding fields, you will have to hardcode a static value.

Transformation Rules

Next, we’ll be creating our transformation rules.
Navigate to the Application Tab > Transformation Rules.

First, we will create our rule groups for each dimension. Click the Create Group button and enter values into the blank fields. Select which dimension you are making a rule group for.

Once the group is created, you can add transformations. There are five types of transformations: One-To-One, Composite, Range, List, and Mask. In the last blog, we discussed how to add One-To-One mappings. Here, we’ll be demonstrating how to add Mask mappings.
Note: Please refer to part one if One-To-One mappings apply to your EBS direct connect.

Below is an example of a Mask. PassThru is the name I gave for this rule. The rule expression is essentially the source value. By adding * to the source and the target value, you are telling OneStream that the values (Accounts, Entities, etc.) coming in from EBS will be exactly the same in OneStream.

Once the groups have been created, we can assign them to a rule profile.

Click the Create Profile button and enter information specific to the profile.

Select the newly created profile under Rule Profiles and click the Manage Profile Members button to assign groups to a profile.

Note: You can select one or more groups and click the right arrow to add them to the profile.

Make sure to save all your changes.

Workflow Profiles

Navigate to Application Tab > Workflow Profiles.

Assuming a cube root workflow profile has been created, we can make a child under the root workflow profile.

Select the cube profile and click Create Child under Workflow Profile.

Enter a name and select the following options. The default cube name will be what you have set up for your application.

Once the Review profile is set up, we want to create a profile to be used for importing our file. Select the Review profile just made and click Create Child under Current Workflow Profile. Enter and select the following options. Template Name will be dependent on how your application is set up.

Click the arrow next to the new EBS profile, and select the Import step. Select the Actual scenario and scroll down to Integration Settings. This is where we will set our data source and transformation profile. By doing so, we are linking this workflow profile to the data source and transformation profile we created earlier.

In this example, I first selected the Import step, then the Actual scenario, then Data Source Name = EBS, and Transformation Profile Name = EBS.

Now that we have our OneStream XF Server Configuration Application, Connector, Business Rule, Data Source, Transformation Rules, and Workflow Profile set up, we are ready to load data!

Loading Data

Navigate to the OnePlace tab and select the EBS Workflow.
Note: Step by step instructions on how to select your workflow are provided in part one.

Click the arrow to the left of the month you want to load to expand the workflow steps. Select the Import step.

Next, click Load and Transform. Notice how we are not provided with a field to select a file and load. This is because data is directly being pulled from EBS into OneStream. Therefore, clicking OK will automatically load data into the stage.

After clicking OK, you will see that our data has been imported into the stage area of OneStream. Next, click the Validate chevron at the top and click the Validate button below it. This will execute any validation rules that have been configured and check for any errors in our mappings.

The last step is to load our data into the cube. Select the Load chevron and click the Load Cube button below it. If the Load chevron turns green, all our data is now in the OneStream analytic cube and ready to be consolidated.

Thank you for tuning into this two-part series on loading data into OneStream. I hope this blog helps you understand the process of setting up a direct connect in OneStream for EBS integrations!

Is It Time to Consider Oracle Lift-and-Shift?

Are you one of those Oracle E-Business Suite (EBS) customers who are not quite ready to go to Oracle ERP Cloud, but whose servers are long since out of warranty? Have you lost the desire or expertise to continue buying and maintaining your own hardware, but you don’t want to give up control or take the chance of tying yourself to a third-party host? After all, if someone is hosting your servers and you decide you don’t want to work with them anymore, you would have to pay a pretty penny to move them elsewhere. That’s enough risk to make external hosting appear not worth your while.

If this sounds like you, maybe you should consider doing a Lift-and-Shift with Oracle and moving your on-premises EBS systems to the Oracle Infrastructure-as-a-Service (IaaS). By doing that, not only do you permanently eliminate your need to worry about the underlying hardware and backups, but you still retain complete control over how your systems are maintained and by whom. You can operate as you do today, or it may be advantageous to work with a partner to help you support your environments. Either way, you make the decisions and stay in control.

As with all cloud infrastructure platforms, you pay only for what you need, and it’s very easy to expand or contract accordingly. It doesn’t have to be just EBS, either; you can migrate any applications you choose. But if you are an EBS customer, Oracle makes the migration effort even easier by including specific functionality to help you achieve this, such as the Oracle E-Business Suite Cloud Manager UI and the Oracle E-Business Suite Cloud Backup Module. Follow this link on Oracle’s site for more details.

I have worked with a number of clients over the years that opted to lift-and-shift, and the majority are happy they made the move. The common concern about loss of control just wasn’t a factor. The upside: the peace of mind of knowing they are housed in world-class data centers, with automatically backed-up servers configured and maintained by tools built by the same company that wrote the software, means they have done just about as much as reasonably possible to minimize their risk of data loss.

From On-Premises ERP to Cloud: Business Transformed

Perficient Presents at Oracle OpenWorld 2019 – Live from the show floor, the GM of Perficient’s ERP practice discusses five key considerations for your cloud journey, starting with education (how things are done in the cloud versus on-premises) through plan deployment.

For ERP customers contemplating a cloud journey, this 8-minute video provides some insight into what you can expect. You’ll learn that a move to the cloud is not just about running your business on the latest and greatest platform; rather, it’s about business transformation and reengineering business processes. To learn more, watch the video now!

As an Oracle Platinum partner with Oracle Cloud Excellence Implementer status, our team can help you whether your company chooses to stay on-premises or move to the cloud. Learn more about our partnership and how we can help you with Oracle here.

The Road to Modernization: From EBS to Oracle ERP Cloud [Webinar]

Save the date for this terrific customer story next week featuring Midcoast Energy, LLC. The company had antiquated and complex business processes that could not scale to meet its needs. Facing an expensive ERP implementation, the company decided a cloud-based platform would provide a strategic advantage.

Midcoast Energy, LLC implemented Oracle Cloud for Financials, Project Portfolio Management, and Procurement, and partnered with Perficient to deliver the implementation in a six-month timeframe.

Join us as Midcoast Energy, LLC’s Archana Shah, director of enterprise systems, and Harish Gulati, manager of ERP systems, discuss the company’s migration to Oracle ERP Cloud, including lessons learned and how Oracle Cloud could benefit your organization as well.

Discussion will include:

  • Challenges with the legacy business processes
  • Why Midcoast Energy, LLC chose Oracle Cloud
  • Benefits realized

We’d love to have you attend our live event, but if you’re unable to make it, all registrants will receive links to the presentation materials and a recording of the on-demand webinar post-event.

Period End Close 123 in Oracle Cloud ERP

The Period End Close Challenge

As an EBS consultant for the last 20 years, one of the hardest parts of my job has been explaining the period end process.

The reconciliation reports, roll forwards, the disparate screens, lack of consistency across the modules. Don’t get me started on inventory, with its archaic requirement to close each inventory organization on its own.  Manage 2000 warehouses and let me know how that works out for you.

Cloud is a Game Changer

The game in Cloud has changed. Oracle has provided the Period Close Monitor, easy-to-use dashboards and reconciliation tools, and manage-by-exception capabilities. It’s *almost* all at your fingertips.

I feel confident Oracle will continue to streamline it in future releases.

 

Period End Close Monitor

The Period Close Monitor allows a quick and immediate snapshot of all the sub-ledgers, with actionable capability so *exceptions* can be dealt with in *real time*. You can see record counts of unaccounted transactions or transactions not yet transferred to the general ledger. You can *drill* into the records and see the specific issues so you can resolve them. No need to dig through reports.

Reconciliations on Demand

In Accounts Receivable and Accounts Payable, there’s a reconciliation dashboard which performs two functions at once. It reconciles the AP/AR trial balance to the GL balance, with drill-down capability to see the differences in categorized buckets (manual journals, for example). And the roll-forward takes you from the previous month’s ending balance to the current month’s ending balance. Each element of the formula is drillable so you can see each transaction that makes up the total. Of course, you can always fall back on the standard reports for validation or supporting documentation. Each reconciliation may be saved for future audit by simply running the “reconciliation extract” process and giving it a unique name such as “AP-JUL-2018”. Doing this means you don’t have to dig through cabinets of reports or network drives to find what you did last summer. Yes, Oracle knows what you did last summer.

Room For Improvement

Fixed Assets, while included in the close monitor, does not yet have a reconciliation dashboard. This is one area I hope Oracle makes some improvement on. All the traditional reports were brought forward from EBS. One new thing Oracle has done in Cloud, as of Release 13: you can now re-open the previous period if no transactions were yet created in the new period. A nice improvement over EBS! There is also a new feature to “Manage All Books” where you can quickly run create accounting, copy to tax books, run depreciation, and close the periods all from one screen, for each FA book! Just select the book and click the buttons. If only we could get those reconciliation dashboards!

A Huge Leap Forward

The most significant improvement is BY FAR the Inventory month end close process. Let me explain: There is NO inventory month end process.  Let me explain further.  If you have 1 or 1000 inventory organizations, you have no inventory close process.  You can transact in inventory to the end of time, without ever worrying about close – separating operations from accounting and finance.  Sheer brilliance.

This is what an ERP should be – a masterpiece of integration minus obstruction. How did Oracle do this? Simple. They moved all the accounting functions into the brand new Costing module. Costing now controls all of the accounting and period end processes. The close process is conducted at the “Cost Book” level, which ostensibly means that, as long as all your inventory organizations belong to the same cost book (within a country), there is only one single inventory close. ONE. SINGLE. INVENTORY. CLOSE. Every inventory transaction is interfaced over to Costing. The costing engine runs. Any errors sit in the costing close, allowing operations to move forward without getting caught up in the costing process. As with EBS, standard costs can be added in after the fact, and the costing engine will correct any errors as corrections are made in real time. Operations is now free from the financial burden of their month end duties.

 

The second biggest improvement is the month end accrual process for expense. As with Inventory, Oracle has essentially removed the month end close from Procurement. Let me say that again: there is no month end process for Procurement. In the Costing tool, financial analysts can run the month end accrual processes on demand, whether AP is closed or not. As AP invoices are matched to POs or receipts, they fall off the next accrual report and accounting entries are created automatically.

 

With these changes in the period close process, there are fewer dependencies between operations and finance teams, placing all the tools the finance department needs in the finance department’s hands. This is how it should be – a 100% top-down view, without bogging down operations. They have product to ship, and how many times have I heard in the realm of EBS: “My operations teams are not filled with CPAs.” Cloud ERP is about efficiency, and it has achieved it.

 

Speaking of dependencies, given the new structure there are far fewer dependencies between the modules as well; however, common sense still prevails. You can close FA any time, but you still shouldn’t close it before AP. You will still more than likely follow the traditional sequence, starting with AP and AR. Costing can be done any time; just keep running the integration between Inventory and Costing daily.

 

For assistance with your period end – contact PERFICIENT!  We are ready to assist you *right now*!
