Oracle Articles / Blogs / Perficient (https://blogs.perficient.com/tag/oracle/)

Did You Know? Outside Processing Doesn’t Have to Be Classified as Direct Material
https://blogs.perficient.com/2024/10/15/did-you-know-that-outside-processing-cost-doesnt-have-to-be-a-costed-as-direct-material/
Tue, 15 Oct 2024

Problem Statement:

The Oracle Fusion Costing distribution process treats Outside Processing (OSP) items like ordinary items (Direct Material). If you procure direct items to work orders, or have an outside service performed on a sub-assembly as a work order step, Oracle Cost Management will cost both “items” as Direct Material, given the default Cost Component Mappings. Likewise, if this is a standard-costed assembly, the OSP portion will show as ordinary material in the item’s standard cost definition. If OSP items are initially set up as regular items, identifying and reporting on OSP work, and associating different costing methods with it, can be a challenge.

 

 

There is a way to account for OSP charges separately.

Solution: Create the following:

  • A Cost Category, and define the OSP items under this Cost Category.
  • A new Cost Element with a new Component Group.
  • A new Cost Profile, and associate the newly created Component Group with it.
  • A Cost Profile Defaulting Rule.

Results:

This approach provides greater flexibility in assigning a different cost method to OSP items.

You also gain the ability to define an actual cost for OSP items and see their variances in Manufacturing rather than in Procurement. (If the OSP item is standard costed, the variance appears as Purchase Price Variance; if it is actual costed, the variance appears on the work order. Did you know that you can define a standard cost for an item that has an actual costing profile?)

You can also see the OSP portion of the assembly cost as a separate cost element in the Cost Rollup, and report and analyze OSP cost within work order costs separately.


Contact

Contact Mehmet Erisen at Perficient for more information on how Perficient can help you implement Oracle Fusion Cloud Supply Chain Management solutions.

www.oracle.com 

www.perficient.com

 

Three Simple Steps to Make Oracle ERP Cloud Ready for 1099 Season
https://blogs.perficient.com/2024/10/04/three-simple-steps-to-make-oracle-erp-ready-for-1099-season/
Fri, 04 Oct 2024

It’s 1099 season – is your ERP ready?

It only takes a few simple steps to verify your data and make sure you are fully prepared to meet the IRS deadline to publish your 1099s.

Step 1:

Validate that your “Reporting Entities” are set up and that they have “Balancing Segments” assigned. A reporting entity contains the federal reporting number. It logically groups your 1099s by paying company from the payables invoice lines.

Step 2:

Validate that each of your 1099 suppliers contains one “site” flagged as a “tax site,” and that it has a “Tax Type” assigned (such as “MISC7”), a country code, and a Tax ID number on the Supplier Profile page. If there is a “Tax Reporting Name” that differs from the supplier name (such as a DBA), verify it is populated with the correct data.

 

Step 3:

Create and pay invoices. Validate that the invoice line items contain “MISC7”. To confirm that these invoices will be picked up on a 1099, run the US 1099 Payment Report. To determine why something is missing, you can run the following helpful reports:

  • US 1099 Invoice Exceptions Report
  • US 1099 Supplier Exceptions Report

 

If you have already created invoices BEFORE your supplier settings were updated, you can run the process “Update Income Tax Details”, which will retroactively populate tax data on your invoices based on the new supplier information.

For more useful tips and tricks, contact Perficient today, respond to this blog, or email matt.makowsky@perficient.com for a free one-hour consultation.

#oraclecloud

#oracleerp

#1099

#oraclefinancials

#oraclesupport

Upcoming Changes for Oracle EPM Cloud Customers
https://blogs.perficient.com/2024/09/09/upcoming-changes-for-oracle-epm-cloud-customers/
Tue, 10 Sep 2024

Oracle routinely announces significant future changes for EPM customers in each month’s cloud readiness documentation. If you’re not aware, Oracle EPM cloud updates are published monthly. If you read the details within the ‘Important Actions and Considerations’ section (located at the bottom of each readiness release), you will notice several very interesting announcements that we’ve highlighted below.

 

Infolets Discontinuing

Starting sometime later in 2024, Infolets functionality will no longer be supported by Oracle. The ability to create Infolets will cease to exist for the Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Planning, and Tax Reporting applications. Nothing specific has been announced about existing Infolets; however, any issues with them will not be addressed by Oracle Support once support ends.

To many of us here at Perficient, the removal of Infolets is not surprising as we’ve seen many customers prefer to use Dashboards 2.0 instead.

 

Smart View Native Mode Option Deprecation

Continuing the theme of eliminating functionality that is not widely used, the Native Smart View option will no longer be supported later this year. Standard will continue to be the ad hoc mode in which Oracle develops enhancements. The announcement contains two additional points customers need to note:

  • The expectation for existing Native-mode worksheets is that they will work “as is” when the setting is changed to Standard.
  • Smart Forms are not supported in Standard mode and there is no plan to support them in Standard mode.

To avoid any negative experience for end users, administrators should proactively start testing all mission-critical ad hoc sheets using Standard mode. If the testing surfaces any issues, customers should log them as enhancement requests on Customer Connect.

 

Removal of Data Management Job Scheduling

Data Integration now has the ability to schedule jobs, which means Oracle will soon be removing the job scheduler within Data Management. The change is expected to occur sometime in Q4 this year, so customers need to start migrating their scheduled jobs to the EPM Job Scheduler console immediately! To help facilitate the migration, Oracle is providing a script (“Migrate Schedules to Platform Jobs Scheduler”) within System Maintenance Tasks in Data Management. This change applies to the following solutions: Enterprise Profitability and Cost Management, Financial Consolidation and Close, Planning, and Tax Reporting.

 

Have a question about the above changes, or want to understand more about Oracle EPM Cloud updates? Drop us a comment!

Discoveries from Q&A with Enterprise Data using GenAI for Oracle Autonomous Database
https://blogs.perficient.com/2024/04/09/discoveries-from-qa-with-enterprise-data-using-genai-for-oracle-autonomous-database/
Tue, 09 Apr 2024

Natural language AI has proliferated into many of today’s applications and platforms. One of the most in-demand use cases is the ability to find quick answers to questions about what’s hidden within organizational data, such as operational, financial, or other enterprise data. Leveraging the latest advancements in the GenAI space together with enterprise data warehouses therefore has valuable benefits. The SelectAI feature of Oracle Autonomous Database (ADB) achieves this outcome. It eliminates the complexity of leveraging various large language models (LLMs) from within the database itself. From an end-user perspective, SelectAI is as easy as asking the question, without having to worry about GenAI prompt generation, data modeling, or LLM fine-tuning.

In this post, I will summarize my findings on implementing ADB SelectAI and share some tips on what worked best and what to look out for when planning your implementation.

Several GenAI Models: Which One to Use?

What I like about SelectAI is that switching the underlying GenAI model is simple. This is important over time to stay up to date and take advantage of the latest and greatest of what LLMs have to offer and at the most suitable cost. We can also set up SelectAI with multiple LLMs simultaneously, for example, to cater to different user groups, at varying levels of service. In the future, there will always be a better LLM model to use, but at this time these findings are based on trials of the Oracle Cloud Infrastructure (OCI) shared Cohere Command model, the OpenAI GPT-3.5-Turbo model and the OpenAI GPT-4 model. Here is a summary of how each worked out:

Cohere Command:

While this model worked well for simple, well-phrased questions with nouns that relate to the metadata, it didn’t do well when the question got more complex. Rather than giving a wrong answer, it returned a message apologizing for its inability to generate one: “Sorry, unfortunately a valid SELECT statement could not be generated…”. At the time of this writing, the Command R+ model had just become generally available, but it wasn’t attempted as part of this exercise; it remains to be seen how effective the newer R+ model is in comparison to the others.

OpenAI GPT-4:

This LLM worked a lot better than Cohere Command in that it answered all the questions that Command couldn’t. However, it comes at a higher cost.

OpenAI GPT-3.5-Turbo:

This one is my favorite so far as it also answered all the questions that Command couldn’t and is roughly 50 times less expensive than GPT-4. It is also a lot faster to respond compared to the OCI shared Cohere Command. There were some differences though at times in how the answers are presented. Below is an example of what I mean:

Sample Question: Compare sales for package size P between the Direct and Indirect Channels

Responses Generated by Each Model:

  • Cohere Command: “Sorry, unfortunately, a valid SELECT statement could not be generated.”
  • OpenAI GPT-3.5-Turbo: This generated a good result set based on the following query, but the results weren’t automatically grouped in a concise manner.

SELECT s.PROD_ID, s.AMOUNT_SOLD, s.QUANTITY_SOLD, s.CHANNEL_ID, p.PROD_PACK_SIZE, c.CHANNEL_CLASS
FROM ADW_USER.SALES_V s
JOIN ADW_USER.CHANNELS_V c ON s.CHANNEL_ID = c.CHANNEL_ID
JOIN ADW_USER.PRODUCTS_V p ON s.PROD_ID = p.PROD_ID
WHERE p.PROD_PACK_SIZE = 'P' AND c.CHANNEL_CLASS IN ('Direct', 'Indirect');

  • OpenAI GPT-4: This provided the best answer; the results suited the question most closely because it grouped by Channel Class to compare sales easily.

SELECT c.CHANNEL_CLASS AS Channel_Class, SUM(s.AMOUNT_SOLD) AS Total_Sales
FROM ADW_USER.SALES_V s
JOIN ADW_USER.PRODUCTS_V p ON s.PROD_ID = p.PROD_ID
JOIN ADW_USER.CHANNELS_V c ON s.CHANNEL_ID = c.CHANNEL_ID
WHERE p.PROD_PACK_SIZE = 'P' AND c.CHANNEL_CLASS IN ('Direct', 'Indirect')
GROUP BY c.CHANNEL_CLASS;

Despite this difference, most of the answers were similar between GPT-4 and GPT-3.5-Turbo, which is why I recommend starting with GPT-3.5-Turbo and experimenting with your schemas at minimal cost.

Another great aspect of the OpenAI GPT models is that they support conversational type questions to follow up in a thread-like manner. So, after I ask for total sales by region, I can do a follow up question in the same conversation and say for example, “keep only Americas”. The query gets updated to restrict previous results to my new request.
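Under the hood, model selection in SelectAI is driven by AI profiles. Below is a minimal sketch of how a profile might be created and activated with the DBMS_CLOUD_AI package; the credential and profile names are hypothetical, and the object list reuses the views from this post.

```sql
-- Hypothetical sketch: define one AI profile per model, then switch per session.
-- Credential and profile names are made up; DBMS_CLOUD_AI is the real ADB package.
BEGIN
  DBMS_CLOUD_AI.CREATE_PROFILE(
    profile_name => 'OPENAI_GPT35',
    attributes   => '{"provider": "openai",
                      "credential_name": "OPENAI_CRED",
                      "model": "gpt-3.5-turbo",
                      "object_list": [{"owner": "ADW_USER", "name": "SALES_V"},
                                      {"owner": "ADW_USER", "name": "CHANNELS_V"},
                                      {"owner": "ADW_USER", "name": "PRODUCTS_V"}]}'
  );
END;
/

-- Activate the profile for the current session before asking questions;
-- switching to a different LLM is just a matter of setting a different profile.
EXEC DBMS_CLOUD_AI.SET_PROFILE('OPENAI_GPT35');
```

Defining one profile per model (and per user group, if needed) is what makes running multiple LLMs side by side at different service levels practical.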

Tips on Preparing the Schema for GenAI Questions

No matter how intelligent an LLM you pick, the experience of using GenAI won’t be pleasant unless the database schemas are well prepared for natural language. Thanks to Autonomous Database SelectAI, we don’t have to worry about the metadata every time we ask a question: it is an upfront setup that applies to all questions. Here are some schema prep tips that make a big difference in the overall data Q&A experience.

Selective Schema Objects:

Limit SelectAI to operate on the most relevant set of tables/views in your ADB. For example, exclude any intermediate, temporary, or irrelevant tables and enable SelectAI on only the reporting-ready set of objects. This is important because SelectAI automatically generates the prompt, including the schema information, to send to the LLM together with the question. Sending metadata that excludes unnecessary database objects narrows the focus for the LLM as it generates an answer.

Table/View Joins:

To get correct joins between tables, give the join columns the same name, for example SALES.CHANNEL_ID = CHANNELS.CHANNEL_ID. Foreign key and primary key constraints don’t affect how tables are joined, at least at the time of writing this post, so we need to rely on consistently named join columns in the database objects.

Create Database Views:

Database views are useful to SelectAI in several ways:

  1. Views let us reference tables in other schemas, so we can set up SelectAI on one schema that references objects in several other schemas.
  2. We can easily rename columns in a view to make them more meaningful for natural language processing.
  3. When creating a view, we can exclude columns that add no value to SelectAI, limiting the size of the LLM prompt at the same time.
  4. We can rename columns in views so that joins land on identical column names.
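As a sketch of the kind of view this enables (the source schema and column names here are illustrative, not from the post’s actual environment):

```sql
-- Hypothetical sketch: expose another schema's table through a reporting view,
-- keeping only the useful columns and renaming them for natural language and joins.
CREATE OR REPLACE VIEW adw_user.channels_v AS
SELECT channel_id,                        -- same name as SALES_V.CHANNEL_ID so joins line up
       channel_class,
       channel_desc AS channel_description
FROM   sh.channels;
```

A view like this covers points 1 through 4 at once: cross-schema access, meaningful names, a trimmed column list, and join columns named identically across objects.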

Comments:

Adding comments makes a huge difference in how much more effective SelectAI is. Here are some tips on what to do with comments:

  • Comment at the table/view level: Describe what type of information a table or view contains. For example, a view called “Demographics” may have a comment such as: “Contains demographic information about customer education, household size, occupation, and years of residency”.
  • Comment at the column level: For security purposes, SelectAI (in non-Narrate modes) doesn’t send data to the GenAI model; only metadata is sent. That means if a user asks about a specific data value, the LLM has no visibility into where that value lives in the database. Where sending some data values to the LLM is not a security concern, include the important values in the comment so the LLM knows where that data is. For example, a comment on a column called COUNTRY_REGION could read: “region. some values are Asia, Africa, Oceania, Middle East, Europe, Americas”. For a channel column, including channel values is similarly useful: “channel description. For example, tele sales, internet, catalog, partners”.

  • Explain coded data values: Sometimes data values are coded and require translation. For example, a comment on column Products.VALID_FLAG: “indicates if a product is active. the value is A for active”.
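In Oracle SQL, these descriptions are attached with standard COMMENT statements. A sketch using the examples above (the exact object names are assumptions):

```sql
-- Table/view-level comment (Oracle uses COMMENT ON TABLE for views as well):
COMMENT ON TABLE adw_user.customer_demographics_v IS
  'Contains demographic information about customer education, household size, occupation, and years of residency';

-- Column-level comments that surface important data values to the LLM:
COMMENT ON COLUMN adw_user.countries_v.country_region IS
  'region. some values are Asia, Africa, Oceania, Middle East, Europe, Americas';

COMMENT ON COLUMN adw_user.products_v.valid_flag IS
  'indicates if a product is active. the value is A for active';
```

Because SelectAI includes these comments in the metadata it sends with each prompt, keeping them short and value-rich pays off more than long prose descriptions.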

Is There a Better Way of Asking a Question?

While the aforementioned guidance is tailored for the upfront administrative setup of SelectAI, here are some tips for the SelectAI end user.

  • Use double quotation marks for data values consisting of multiple words: This is useful, for example, when we want to filter data on particular values such as a customer or product name. The quotation marks also help pass the right case sensitivity of a word. For example: what are the total sales for “Tele Sales” in “New York City”.
  • Add the phrase “case insensitive” at the end of your question to help find an answer. For example: “calculate sales for the partners channel case insensitive”. The SQL condition generated in this case is WHERE UPPER(c.CHANNEL_CLASS) = 'PARTNERS', which simply means ignore case when looking for information about partners.
  • If the results are unexpectedly filtered, add a statement like the following at the end of the question: “Don’t apply any filter condition”. This was more applicable to the Cohere Command model than to the OpenAI models.
  • Starting the question with “query” instead of “what is”, for instance, worked better with the Cohere Command model.
  • Be field-specific when possible: Instead of just asking for information by customer or by product, name the field, such as “customer name” or “product category”.
  • Add additional instructions to your question: You can follow the main question with specific requests, for example on how to filter or return the information. Here is an example of how this can be done:

“what is the average total sales by customer name in northern america grouped by customer. Only consider Direct sales and customers with over 3 years of residency and in farming. case insensitive.”

Results are returned based on the following automatically generated SQL query:

SELECT c.CUST_FIRST_NAME || ' ' || c.CUST_LAST_NAME AS CUSTOMER_NAME, AVG(s.AMOUNT_SOLD)
FROM ADW_USER.SALES_V s JOIN ADW_USER.CUSTOMERS_V c ON s.CUST_ID = c.CUST_ID
JOIN ADW_USER.COUNTRIES_V co ON c.COUNTRY_ID = co.COUNTRY_ID
JOIN ADW_USER.CHANNELS_V ch ON s.CHANNEL_ID = ch.CHANNEL_ID
JOIN ADW_USER.CUSTOMER_DEMOGRAPHICS_V cd ON c.CUST_ID = cd.CUST_ID
WHERE UPPER(co.COUNTRY_SUBREGION) = 'NORTHERN AMERICA'
AND UPPER(ch.CHANNEL_CLASS) = 'DIRECT'
AND cd.YRS_RESIDENCE > 3
AND UPPER(cd.OCCUPATION) = 'FARMING'
GROUP BY c.CUST_FIRST_NAME, c.CUST_LAST_NAME;
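For reference, on Autonomous Database an end user submits such a question with the SELECT AI syntax once an AI profile is set for the session; the showsql action returns the generated SQL instead of running it. A minimal sketch (the question text is just an example):

```sql
-- Run the question and return the results (runsql is the default action):
SELECT AI what are the total sales by region;

-- Show the SQL that SelectAI generates, without executing it:
SELECT AI showsql what are the total sales by region;
```

The showsql variant is handy while tuning schema comments and views, since it reveals exactly how the LLM interpreted the metadata.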

It’s impressive to see how GenAI can take the burden off the business in finding quick and timely answers to questions that may come up throughout the day, all without data security risks. Contact us if you’re looking to unlock the power of GenAI for your enterprise data.

Upcoming Oracle EPM Cloud Essbase Update
https://blogs.perficient.com/2024/03/30/upcoming-oracle-epm-cloud-essbase-update/
Sat, 30 Mar 2024

Oracle EPM Cloud customers whose solutions use non-hybrid Essbase versions will be upgraded to a hybrid-capable version soon. According to Oracle’s March release notes, this upgrade will occur in the next two months. It’s important to note that the upgrade does not force non-hybrid configured cubes to become hybrid. It simply updates the Essbase version within the application as part of Oracle’s effort to upgrade all environments to an Oracle Essbase version that supports hybrid cubes. Quoting Oracle: “…if the application is configured to use non-hybrid cubes, this update will not change it to use hybrid cubes.” I know, I know… Whew!

Now, if you’re not sure whether your application is hybrid, admins can easily find out by checking the Activity Report in each application. If you find a “Yes” value in the “Essbase Version supports Hybrid Block Storage Option” field, then the application is hybrid enabled (see the image below for where to find this information).

Essbasesupporthybrid

 

Non-hybrid Oracle EPM Cloud customers who have concerns about potential issues: have no fear. Oracle provides a new utility to validate the outlines in your cubes. The utility can be found under the Actions menu in Application Overview. Select the “Pre validate Outline” option, then select the Outline Pre-validation Report to view a list of incompatible member formulas. Customers with hybrid-capable applications will not see this option.

This upgrade applies to the following application types: Financial Consolidation and Close, FreeForm, Planning, Planning Modules, and Tax Reporting.

Have questions about this update, or need help fixing incompatible formulas? Contact us today! We’d love to help!

 

 

Important Update for Oracle EPM Cloud
https://blogs.perficient.com/2024/03/29/daily-maintenance-update-for-oracle-epm-cloud-solutions/
Fri, 29 Mar 2024

Oracle’s release notes for the March update include a very important feature for Oracle EPM Cloud customers. Daily Maintenance can now be configured to start at any minute within any hour (i.e., HH:MM format). Previously, the window could only be configured to start at the top of the hour. This gives customers more control over the maintenance processes. See below for an example of this new format in use.

Dailymaintupdate

Additionally, the release notes include the following notice…

“Starting with the April Update (24.04), the daily maintenance start time of EPM Cloud environments will be updated by assigning a random minute at which the maintenance process will start. Currently, the daily maintenance starts at the top of the hour, for example at 2:00 p.m. UTC (14.00 on a 24 hour clock). This change will modify the start time by adding a value for minute (for example, 2:24 p.m. UTC (14.24 on a 24 hour clock) so that the daily maintenance load on EPM Cloud infrastructure is distributed throughout the hour.”

The start time can be changed by Oracle EPM cloud Admins after the update is applied. Therefore, we recommend that customers review their Daily Maintenance settings for all EPM applications after the April updates have been applied.

 

Top 3 New Features for Oracle EDM
https://blogs.perficient.com/2024/03/27/top-3-new-features-for-oracle-edm/
Thu, 28 Mar 2024

Oracle’s product development team for EDM (Enterprise Data Management) continues to provide highly valuable new features each quarter, and the release earlier this month is no different. With 30 enhancements on the list, all providing immense benefits to customers, it’s a challenge to narrow them down to the top three. However, after some debate, our Oracle EDM specialists were able to agree on the following!

 

Node Name Calculation

This feature is easily the one that sticks out. The inability to auto-populate node names for simple use cases (such as using the next logical sequential number for accounts) has been an annoyance, not to mention a huge time waste for many users. I can personally recall numerous clients expressing grief about having to rely on manual intervention for something so trivial. Can’t say I don’t agree with them. But that changes this month!

Node names can now be automatically generated. The automation can be configured based on node properties or hierarchical relationships (such as using the name of a parent node to derive the name). Customers who rely heavily on custom validations to enforce naming standards can now use this feature to further enhance the user experience by removing manual (and often frustrating) involvement in simply naming nodes.

 

Validation Severity Enhances Workflows

Not every violation of a validation rule can be resolved by the user submitting a request. This could be because users don’t have enough information, or because the information is sensitive and not something they are privy to based on their role. For many Oracle EDM customers, however, these situations should not stop the user’s request from advancing. For these situations, validations can now be configured as warnings: the user is still notified of the issue, but the request can be submitted and the violation resolved in later workflow stages. Additionally, validations can be ignored outright for certain workflow stages if needed.

Providing the flexibility of how/when validations are enforced greatly enhances the user experience. We are excited to see customers using this new feature in the coming months & years!

 

Select Multiple Nodes for Moves and Inserts

Starting this month, users can select multiple nodes when performing an Insert or Move. The nodes can come from a single parent or multiple parents. This removes many, many “clicks of the mouse” when performing mass changes, and the feature enhances the user experience while increasing efficiency.

 

Want to learn more about Oracle EDM’s latest features, or have questions about the solution’s capabilities? Contact us or leave a comment and we’ll be happy to answer!

 

 

 

End of Year Enhancements for Oracle Enterprise Data Management (2023)
https://blogs.perficient.com/2023/12/26/enhancements-for-oracle-enterprise-data-management/
Wed, 27 Dec 2023

Earlier this month, Oracle released a long list of new features for EDM (Oracle Enterprise Data Management). The enhancements address several areas that greatly improve the tool for both administrators and business users. Follow along as we take a look at a few of the features we believe are most beneficial.

 

Customized Navigation and Information

Since the initial release of EDM, the home page has had the same layout and options for both administrators and business users. Now, Oracle provides the ability to customize the home page for specific users, helping reduce confusion and streamline navigation. This puts EDM on par with the other Oracle EPM Cloud offerings with regard to tailoring the user experience. Existing customers should note a new card named “Information Model” that now contains the Node Types, Hierarchy Sets, Node Sets, Properties, and Lookup Sets cards within the default layout.

Additionally, EDM now provides Announcements to be displayed on the home page. Administrators can create announcements to communicate various information to the user community. Users view announcements from the welcome panel on the home page. Announcements can include hyperlinks and can be effective dated as well.

Edm Homepagenew

 

Oracle Financials Cloud General Ledger Integration Processing Gains

The pre-built integration with Financials GL now allows only the specific trees and tree versions contained in EDM to be selected for processing. This reduces processing time for Financials GL customers with multiple trees/tree versions. The ‘Process Segment Values and Hierarchies’ task and the ‘Publish Hierarchies to Cube’ task are limited to processing only the trees/tree versions managed in EDM (export options in EDM control these tasks).

 

 

Test of Expressions Reduces Time-to-Build

The ability to customize properties and validations within EDM through derived expressions is a major benefit to all EDM customers. However, unit testing the expressions has always required the developer to navigate away from the expression builder screen to view the data. That is no longer the case! EDM can now test an expression directly in the expression builder. Talk to any EDM developer and they will undoubtedly give you an earful about just how much this streamlines their ability to build custom derived properties and validations.

In the image below, the Expression Tester tab can be viewed on the right. The result of the expression is displayed to the right of the ‘Evaluate’ button at the bottom.

Edm Testexpression

What’s so scary about AI for HR?
https://blogs.perficient.com/2023/10/31/whats-so-scary-about-ai-for-hr/
Tue, 31 Oct 2023

AI in Oracle CloudWorld, Performance Reviews and Oracle Grow

Oracle CloudWorld and AI introduction for HCM

At Oracle’s CloudWorld last month, the dominant theme was Artificial Intelligence (AI). With all of the new lingo, one can either feel overwhelmed or have better rizz in meetings by knowing the latest buzzwords, such as generative AI, machine learning (ML), large language model (LLM), or natural language (NL). And to go old school, I can even think of a few AI movies, from ‘WarGames’, where the AI computer played chess, to Will Smith battling the robots in ‘I, Robot’. Should we be scared of the possibilities or embrace the new changes?

Man And Robot Write Together

Well, thankfully, Oracle has made AI a pleasant addition to the HCM modules. The initial use of AI focuses on Authoring, Insights/Planning, Suggestions, and Summarization. Specifically, I am calling out two areas where AI can really enhance the HCM modules: performance reviews and Oracle Grow.

Performance Reviews and AI

The first involves writing performance reviews, i.e., Summarization in the Performance Management module. As a manager, performance reviews require the analytical thinker to put together some elegant wording for each direct report. And if the company requires reviews on a semi-annual basis, the process is even more cumbersome than once a year.

This is where generative AI comes into play. Based on the numeric ratings for each category, plus any feedback from peers or dotted-line managers, all you need to do is click a button to ‘generate’ a performance review within the HCM module. The output is a nice summary that needs very few edits, much like ChatGPT output. Now the analytical thinker can go back to their analysis, or to playing chess. What a relief!

Oracle Ai And Performance Reviews

 

Oracle Grow and AI

The second AI feature for Oracle Cloud HCM is within Oracle ME: specifically, Oracle Grow, which acts as a development coach by making Suggestions in the areas of learning, skill development, and career mobility. These Suggestions become more intuitive and meaningful with the AI engine, which results in employees becoming more engaged with their career and company. What more can you ask of AI in the HR world? And it helps that Oracle Grow was named the 2023 Top HR Product of the Year by Human Resource Executive.

Ai And Oracle Me

Oracle Grow And Ai

 

Just as a teaser, here are a few other notable AI wins on the HCM horizon. For Summarization, think candidate qualification; for Authoring, think Goal Creation, Job Requisition/Description, or Anytime Feedback writing support. So AI in Oracle Cloud HCM does not seem so scary after all; it will actually let us spend more time on value-added tasks. Sounds like a treat to me!

Contact Perficient’s Oracle Cloud HCM practice to modernize your HR system with Oracle Cloud HCM, whether for a full assessment or a check-up of your existing Oracle Cloud HCM environment.

Simplify Your Workflow: Streamlining Business Processes with EPM Pipeline Scheduling https://blogs.perficient.com/2023/10/19/simplify-your-workflow-streamlining-business-processes-with-epm-pipeline-scheduling/ https://blogs.perficient.com/2023/10/19/simplify-your-workflow-streamlining-business-processes-with-epm-pipeline-scheduling/#respond Thu, 19 Oct 2023 14:47:02 +0000 https://blogs.perficient.com/?p=345018

If you have worked with EPM Pipelines, you know how incredibly relevant they are for customers who do not have the resources for data integration automation. Pipelines have been a saving grace for streamlining tasks and data load processes within the EPM application itself. However, one important element had been missing: a comprehensive scheduling option. Many of us found ourselves eagerly waiting for Oracle to introduce a scheduling option that could give way to complete automation. In the September 2023 update for FreeForm, Planning, and Planning Modules applications, Oracle added a Pipeline job type to the Jobs Scheduler, making data load automation possible.


Here’s how to go about it: 


After creating a Pipeline process for Data Integration in Planning Data Exchange (more on this in my recent blog), you can schedule it as an Integration Pipeline job. Here’s how I scheduled my Data Load Pipeline process to run daily and perform multiple actions in a sequence.  


To schedule an Integration Pipeline job:

  • In your Freeform, Planning, or Planning Modules application, click ‘Jobs‘ under ‘Application‘.


  • Click ‘Schedule Jobs‘.


  • A new Job type is available: Integration Pipeline.


  • Define a daily schedule for the Job, provide a name and click ‘Next‘.


  • Select the Integration Pipeline that you want to schedule.


  • Select the parameters required for the Pipeline in this schedule. Click ‘Ok‘.


  • Click ‘Next‘.


  • Review the scheduled job and click ‘Finish‘.


  • The scheduled Pipeline is ready to go!


And there you have it. Not only can you create a Pipeline with the multiple types of actions required to maintain your application, but you can also schedule that entire process to run in an automated fashion.
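If you prefer to trigger the same Pipeline from an external scheduler rather than the in-application Jobs Scheduler, a REST call can submit it as a job. The sketch below only assembles the request; the endpoint path, the “pipeline” jobType string, and the variable names are assumptions to verify against Oracle’s REST API for Data Integration documentation for your environment.

```python
# Hypothetical sketch: submitting a Pipeline run over REST instead of
# using the in-application Jobs Scheduler. The endpoint path, jobType
# string, and variable names are assumptions -- check the REST API for
# Data Integration docs for your pod before relying on them.
import base64
import json

def build_pipeline_job(pipeline_code, start_period, end_period, import_mode="Replace"):
    """Build the JSON body for a pipeline job request."""
    return {
        "jobType": "pipeline",        # assumed job type for Pipelines
        "jobName": pipeline_code,     # the Pipeline Code defined in Data Exchange
        "variables": {                # runtime parameters, mirroring the UI prompts
            "STARTPERIOD": start_period,
            "ENDPERIOD": end_period,
            "IMPORTMODE": import_mode,
        },
    }

def build_request(base_url, user, password, body):
    """Assemble the URL, headers, and payload; the POST itself is left to the caller."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {
        "url": f"{base_url}/aif/rest/V1/jobs",   # assumed Data Integration REST path
        "headers": {
            "Authorization": f"Basic {token}",
            "Content-Type": "application/json",
        },
        "data": json.dumps(body),
    }

job = build_pipeline_job("DAILYLOAD", "Oct-23", "Oct-23")
req = build_request("https://example-epm.oraclecloud.com", "svc_user", "secret", job)
# A real run would then POST req["data"] to req["url"] with req["headers"],
# and poll the returned job ID for completion status.
```

Polling the returned job ID until the Pipeline finishes would complete the automation loop; EPM Automate offers a similar scripted route if you prefer a CLI over raw REST.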


Oracle Updates the Cloud EPM Data Integration UI https://blogs.perficient.com/2023/09/29/oracles-data-integration-update/ https://blogs.perficient.com/2023/09/29/oracles-data-integration-update/#respond Sat, 30 Sep 2023 01:30:02 +0000 https://blogs.perficient.com/?p=345852

If you’ve been following Oracle’s monthly updates for Cloud EPM, then you are aware that Oracle previously announced that the Data Management user interface is slowly being replaced with the newer Data Integration screen (found under the Data Exchange tile). The move does not mean the deprecation of functionality, but rather an enhancement to the overall user experience. The Data Management UI is largely a holdover from its on-prem equivalents. And to be honest, it looks just as antiquated… and boring. Yes, I can hear all of you long-time Oracle EPM techies yelling: it works, why mess with it?! But in response, I’ll say what I was once told by an old girlfriend… looks matter. Or maybe a saying from the sports world fits better: “look good, feel good, play good.” Jokes aside, the newer interface is, in my opinion, much more intuitive for users. And September’s Cloud update contains a few key items to note regarding this change.

First, several Data Management screens are no longer available; each is now accessible via the corresponding Data Integration page.




Second, the update notes state that all Data Management features will eventually move to Data Integration except the following:

  • The batch feature will be replaced by the new Pipeline feature. The Pipeline feature was available in the June (23.06) update.
  • The Report Definition feature will not be migrated, only the Report Execution feature.
  • The ability to create new Custom Applications in Data Management will no longer be available, and customers should use the “Data Export to File” application type instead.

Regarding the last bullet, for customers still using custom target applications, Oracle recommends using the Upgrade Custom Applications option to migrate each custom target application to a data export to file application. Additional documentation on the topic can be found here. But don’t worry, this change doesn’t need to happen immediately, as “Existing integrations that use a custom target application will not be impacted, and will still run without any changes,” according to the release notes.


Are you already using Oracle’s Data Integration interface? Have you familiarized yourself with the new interface? Or have questions about it? Comment and let us know!

Creating Controls on Process Execution using Pipelines Continue-Stop Options https://blogs.perficient.com/2023/09/28/creating-controls-on-process-execution-using-pipelines-continue-stop-options/ https://blogs.perficient.com/2023/09/28/creating-controls-on-process-execution-using-pipelines-continue-stop-options/#respond Thu, 28 Sep 2023 14:12:10 +0000 https://blogs.perficient.com/?p=344761

You can create a series of stages in an EPM Pipeline, where each stage contains one or more jobs that perform actions like data loads, rule executions, file operations, etc. By organizing these jobs and stages in a single Pipeline, EPM customers can streamline their business processes into executable “workflows”.


But what if one of the jobs fails, causing the corresponding stage to fail? There could be several reasons for that to happen: a connection issue, invalid credentials, or a data issue. Whatever the reason for a failure in a Pipeline, the whole workflow comes to a stop. Now, this may be exactly what needs to happen; if the first step of the process has an issue, the user may not want the process to proceed. Conversely, the process may still need to proceed despite a failure at some stage in the Pipeline. There may be another case where the user just wants to bypass a certain step in the process regardless of whether it succeeds or fails.


The good news is that there is a way to design your Pipeline to work for any and all of these scenarios! You can use “Continue” and “Stop” options while creating the stages in your Pipeline to determine how the Pipeline process should progress.


As of September 2023, these options are available only in Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Planning, Planning Modules, and Tax Reporting.


Using Pipeline Execution Options

To create execution controls on the stages of your Pipeline:

  • From the Data Integration home page under Data Exchange, click the icon to the right of the Pipeline you want to edit, and then select “Pipeline Details”.


  • Click the Stage in your Pipeline to which you want to apply execution controls. The Editor box opens on the right. The first job in the Stage selected below is ‘RPT Actuals Load’.


  • You will see two new options: “On Success” and “On Failure”. These define how the Pipeline should progress when the current Stage succeeds and when it fails. Click “On Success”.


  • In my case, if the ERP Actuals load to the RPT cube of my Planning application is a success, I can do one of the following: a) Continue to the next Job in this Stage i.e., ‘FS Actuals Load’, b) Stop the Pipeline execution at this point, c) Skip to the next Stage in the Pipeline i.e., ‘Actuals Copy to Forecast’, or d) Skip to the last Stage in the Pipeline i.e., ‘Calculate Balance Sheet and CF’.
  • Make the required selection. Then click “On Failure”.


  • If the Actuals load to RPT fails, you can do one of the following: a) continue to load Actuals to the FS cube, i.e., ‘FS Actuals Load’, b) stop the Pipeline execution at this point, c) skip the Actuals load Stage altogether and proceed to the data copy Stage, i.e., ‘Actuals Copy to Forecast’, or d) skip to the last Stage in the Pipeline, i.e., ‘Calculate Forecast Balance Sheet & CF’.
  • This way, when a failure occurs in any stage of your Pipeline, you can still choose to perform the other actions in your business process that are not impacted by the failed step, or redirect the Pipeline execution to a “cleanup” step, like notifying administrators of the failure via email.
  • Make the required “On Failure” selection. The Pipeline is auto-saved.


Using these execution controls gives users of EPM Pipelines incredible flexibility in managing their business processes without having to use complex scripting or external tools.
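To make the branching concrete, here is a small, self-contained sketch in plain Python. This is not an Oracle API; it simply models how the “On Success” / “On Failure” selections steer a run, with stage names borrowed from the walkthrough above.

```python
# Illustrative sketch only -- NOT an Oracle API. It models how the
# "On Success" / "On Failure" selections steer a Pipeline run; the
# stage names come from the walkthrough, and the engine is a stand-in.

def run_pipeline(stages, outcomes):
    """Walk stages in order, honoring each stage's configured action.

    stages:   list of dicts with 'name', 'on_success', 'on_failure'
              (actions: 'continue', 'stop', or 'skip_to_last')
    outcomes: dict mapping stage name -> True (success) / False (failure)
    Returns the names of the stages that actually executed.
    """
    executed = []
    i = 0
    while i < len(stages):
        stage = stages[i]
        executed.append(stage["name"])
        action = stage["on_success"] if outcomes[stage["name"]] else stage["on_failure"]
        if action == "continue":
            i += 1
        elif action == "stop":
            break
        elif action == "skip_to_last":
            # Jump straight to the final stage (e.g. a cleanup or
            # notification step) -- unless we are already on it.
            i = len(stages) - 1 if i < len(stages) - 1 else len(stages)
        else:
            raise ValueError(f"unknown action: {action!r}")
    return executed

stages = [
    {"name": "RPT Actuals Load", "on_success": "continue", "on_failure": "skip_to_last"},
    {"name": "Actuals Copy to Forecast", "on_success": "continue", "on_failure": "stop"},
    {"name": "Calculate Balance Sheet and CF", "on_success": "continue", "on_failure": "stop"},
]

# If the first load fails, the run jumps straight to the final stage:
run_pipeline(stages, {"RPT Actuals Load": False,
                      "Actuals Copy to Forecast": True,
                      "Calculate Balance Sheet and CF": True})
# -> ['RPT Actuals Load', 'Calculate Balance Sheet and CF']
```

The design point the sketch captures: the failure action is what lets you route a broken run to a terminal cleanup or notification stage instead of simply halting the whole workflow.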
