Data Strategy Articles / Blogs / Perficient
https://blogs.perficient.com/tag/data-strategy/

John Vylasek Translates Complexity into Impact Across Perficient’s Data Practice
https://blogs.perficient.com/2025/10/08/john-vylasek-translates-complexity-into-impact-across-perficients-data-practice/ | October 8, 2025

Perficient’s vibrant culture is fueled by leaders who bring sharp thinking, deep expertise, and a collaborative spirit to their day-to-day work. We recently connected with John Vylasek, senior solution architect, whose journey from military intelligence and commodity trading to client strategy has shaped his bold, visionary approach to problem-solving.  

John currently leads strategic data efforts at Perficient, using AI, analytics, and deep business knowledge to help clients drive impact. In this People of Perficient profile, we’ll explore how John’s diverse background and passion for innovation empower Perficient to deliver transformative, AI-first solutions with purpose and impact. 

What is your role? Describe a typical day in the life.    

As a data strategist, I help solve a wide range of client challenges. My day usually starts with reviewing my plan from the previous day while also checking emails and Teams for any new adjustments. I’m currently engaged full-time as Data Delivery Director with a major global financial services client, where things can shift quickly due to regulatory and business requirement changes. Thankfully, my team is always responsive and able to adapt as needed, which makes Perficient a great strategic partner. 

How did your experience in the military shape your approach to leadership? 


My time in military intelligence taught me how to break down complex problems, communicate clearly up the chain of command, and lead with both confidence and humility. I was asked at the age of 19 to assemble a team of intelligence analysts with the skills required to tackle an important and ambiguous challenge. We developed effective new techniques, and just months before Operation Desert Storm began, we applied our solutions to great effect. That experience taught me the value of building the right team, being bold yet respectful, and focusing on meaningful impact. One of the biggest lessons I carry is to “just make it happen”—a mindset I apply daily, whether with clients or internal teams. 

What brought you to Perficient, and how do your past experiences align with your current role? 

I joined Perficient while consulting independently, thanks to a referral, and it just clicked. My background in tech, leadership, analytics, and decomplicating ambiguity makes data strategy a great fit. I’m still coding, diagramming, translating complexity into clarity, and diving headfirst into new challenges. The consulting side was new to me at this scale, but because I’ve solved similar problems before, I adjusted quickly. I love helping clients align, adapt, and problem-solve with confidence, and I get genuinely excited when a new curveball comes my way. 

Whether big or small, how do you make a difference for our clients, colleagues, communities, or teams? 

I make a difference by asking many questions. It might seem small, but it helps the entire team learn and speak up. I often hear, “I was wondering the same thing,” which creates a more open, collaborative environment. I also love sharing resources that have helped me grow, like Cloud Data Architectures Demystified or helpful Udemy courses. As a lifelong learner, I’m always generous when passing along what’s worked for me. 

READ MORE: Hear From Our Colleagues About How They Prioritize Learning and Development 

Whether it’s creating a quick diagram during a meeting or reframing a problem in clearer terms, I break things down into understandable chunks so people can understand our goals and act on next steps with confidence. I take the same approach in client discovery—asking leading questions, listening closely, and creating a safe space for open and transparent communication. Being humble and approachable makes it easier for others to do the same, and that’s where real progress happens. 

What was the most rewarding part about serving as a mentor in the Mark Cuban AI Bootcamp? 


The most rewarding part of mentoring in the Mark Cuban AI Bootcamp was watching a group of bright high school students come together, collaborate, and build something meaningful. With some light coaching on inclusion and teamwork, they quickly aligned and focused on a shared goal: using AI to help people in the physical world.  

We developed a gesture recognition model to help people with communication challenges express specific needs through custom-trained motions, even when away from familiar caregivers. It was powerful to see how that idea took shape through trial, iteration, and collaboration. 

By the end of the program, not only did our model work, but my students also won the final “Shark Tank” pitch for the most impactful idea. One of them said, “We never would have gotten here if everyone didn’t think differently,” which really stuck with me. That diversity of thought, and the chance to help guide it, was incredibly rewarding. Hearing later that one student had highlighted her experience working with me as a key takeaway from the program made it even more meaningful. 

READ MORE: Perficient’s Award-Winning Partnership With the Mark Cuban Foundation 

What advice would you give to colleagues who are just starting their career with Perficient?

My first piece of advice is simple: handle your basics. Get your timesheets in, complete your training, and manage your time like a professional while always learning and improving yourself. It may seem small and obvious, but it sets the tone for everything else you do. 

Second, understand that how you show up internally at Perficient may need to be different than how you show up with a client. With clients, you lead through collaboration and patience, bringing people along at their own pace. Internally, especially during pursuits, business moves fast, and it helps to be more direct and decisive. Know your role, understand who’s leading, and stay aligned. If you have a concern, think about whether it’s the right time to raise it. I’ve coached and mentored people on this—sometimes it’s better to hold off on a small technical rabbit hole detail rather than disrupt momentum when the group is already aligned. I’m still learning and adapting myself, but this distinction has been key to working effectively. 

Why are you #ProudlyPerficient?    


I’m #ProudlyPerficient because I get to work alongside sharp, highly adaptive people who are always ready to dive in and get things done. Internally, there’s a fast pace and a bias toward action, so I’ve learned that often you need to step up, assign roles, and lead decisively. It’s not about being the loudest voice in the room; it’s about clarity, support, and knowing when to speak up and when to stay focused on the goal. That kind of teamwork and trust is what gets stuff done. 

I appreciate how differently we show up for clients: collaborative, patient, meeting them where they are. That flexibility between both modes while staying grounded in the work is what makes Perficient special. We’re not just delivering solutions; we’re building alignment. I’m proud to be a part of that. 

How has collaborating with our global teams shaped your growth journey at Perficient?

I’ve been leading global teams for many years, and the approach is consistent—find the people who make the extra effort to communicate, align, and get things done. Whether they’re in Latin America, India, or elsewhere, those relationships are what get “it” done. Building that network, finding your go-to experts, and recognizing talent across borders have been the most rewarding parts of my journey. 

LEARN MORE: Perficient’s Global Footprint Enables Genuine Connection and Collaboration 

How does staying up to date with evolving technologies help you better serve clients?   

One of my goals is to deepen my understanding of how to use local large language models (LLMs) in secure, practical ways. It’s an incredible accelerator for learning and staying up to speed. With a manufacturing client, I used an LLM to help map two complex database schemas. By feeding in just the field names and a few sample rows, the model was able to do most of the heavy lifting in identifying how the old system aligned with the new one. It wasn’t perfect, but it saved a lot of time and gave us a strong head start. Continuing to explore how AI can support data strategy and problem-solving is a key part of my growth path. 
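
This kind of schema-mapping assist is easy to prototype. The sketch below is a minimal illustration of the general idea, not the actual client work: it builds a prompt from the field names and a couple of sample rows of two invented schemas, which you would then hand to a locally hosted model. The schemas, field names, and the call_local_llm stub are all hypothetical placeholders.

```python
import json

# Invented source and target schemas: field names plus a couple of sample rows.
legacy_schema = {
    "fields": ["CUST_NO", "CUST_NM", "ADDR_1", "ZIP_CD"],
    "sample_rows": [["10042", "Acme Corp", "12 Main St", "55101"]],
}
new_schema = {
    "fields": ["customer_id", "customer_name", "street_address", "postal_code"],
    "sample_rows": [["C-88731", "Globex LLC", "400 Oak Ave", "60601"]],
}

def build_mapping_prompt(source: dict, target: dict) -> str:
    """Ask the model to propose a field-level mapping between two schemas."""
    return (
        "You are helping migrate data between two database schemas.\n"
        "Propose a mapping from each source field to the most likely target field.\n"
        "Return a JSON list of {\"source\": ..., \"target\": ..., \"confidence\": 0-1} objects.\n\n"
        f"Source schema: {json.dumps(source)}\n"
        f"Target schema: {json.dumps(target)}\n"
    )

def call_local_llm(prompt: str) -> str:
    """Hypothetical stand-in for a locally hosted LLM call (e.g. an Ollama or
    LM Studio endpoint). Wire this up to whatever local model you actually run."""
    raise NotImplementedError("connect this to your local model endpoint")

if __name__ == "__main__":
    prompt = build_mapping_prompt(legacy_schema, new_schema)
    print(prompt)
    # draft_mapping = call_local_llm(prompt)  # review the draft -- it won't be perfect
```

As noted above, the model’s output is a head start rather than a finished mapping; the draft still has to be reviewed against the real data before anyone relies on it.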

What does being an AI-first company mean to you?

To me, being an AI-first company means starting from a place where AI is always considered not only as a solution itself but also as an accelerator to identify the solution and the steps to get there. It is a new way of thinking that saves us time and our clients’ money.

At Perficient, we approach AI with purpose and lead conversations about when it makes sense to lead with it and when it does not. That level of thoughtfulness is part of what sets us apart. 

LEARN MORE: How We Are Building an AI-First Enterprise 

What are you passionate about outside of work?   


Outside of work, I spend a lot of time with my family. My oldest son lives just four doors down, and we’re often outside fishing with the grandkids. I also support my wife, who went from being a stay-at-home mom to now serving as Dean of the School of Health Sciences over several programs. I’m the primary cook at home and like making healthy meals. I’m also into photography, especially aurora and space photography. I’ve been able to get some great shots even from our backyard in the city. Staying active is also a big focus of mine, so we spend a lot of time out in nature. 

SEE MORE PEOPLE OF PERFICIENT  

It’s no secret our success is because of our people. No matter the technology or time zone, our colleagues are committed to delivering innovative, end-to-end digital solutions for the world’s biggest brands, and we bring a collaborative spirit to every interaction. We’re always seeking the best and brightest to work with us. Join our team and experience a culture that challenges, champions, and celebrates our people.  

Learn more about what it’s like to work at Perficient at our Careers page. See open jobs or join our talent community for career tips, job openings, company updates, and more!  

Go inside Life at Perficient and connect with us on LinkedIn, YouTube, X, Facebook, and Instagram. 

Exploring the Free Edition of Databricks: A Risk-Free Approach to Enterprise AI
https://blogs.perficient.com/2025/06/24/explore-databricks-free-edition-risk-free-analytics/ | June 24, 2025

Databricks announced a full, free version of the platform at the Data and AI Summit. While the Free Edition is targeted to students and hobbyists, I also see opportunities where enterprise architects can effectively evangelize Databricks without going through Procurement for a license. Choosing the right platform to manage, analyze, and extract insights from massive datasets is crucial, especially with new and emerging GenAI use cases. We have seen many clients paralyzed by the combination of moving to a cloud database, comparing and contrasting the different offerings, and doing all of this analysis with only a very murky picture of what the new AI-driven future holds. The Community Edition has always been free, but it has not been feature-complete. With its new Free Edition, Databricks presents an exceptional opportunity for organizations to test its capabilities with no financial commitment or risk.

What is Databricks Free Edition?

The Free Edition of Databricks is designed to provide users with full access to Databricks’ core functionalities, allowing them to explore, experiment, and evaluate the platform’s potential without any initial investment. This edition is an excellent entry point for organizations looking to understand how Databricks can fit into their data strategy, providing a hands-on experience with the platform’s features.

Key Features of Databricks Free Edition

  1. Simplified Setup and Onboarding: The Free Edition offers a straightforward setup process. Users can easily create an account and start exploring Databricks’ environment in a matter of minutes. This ease of access is ideal for decision-makers who want to quickly assess Databricks’ capabilities.
  2. Complete Workspace Experience: Users of the Free Edition get access to a complete workspace, which includes all the necessary tools for data engineering, data science, and machine learning. This enables organizations to evaluate the entire data lifecycle on the Databricks platform.
  3. Scalability and Performance: While the Free Edition is designed for evaluation purposes, it still provides a glimpse into the scalability and performance efficiency that Databricks is known for. Organizations can run small-scale analytics and machine learning tests to gauge how the platform handles data processing and computation tasks (see the short notebook sketch after this list).
  4. Community Support and Resources: Users can benefit from the extensive Databricks community, which offers support, tutorials, and shared resources. This can be particularly valuable for organizations exploring Databricks for the first time and wanting to leverage shared knowledge.
  5. No Time Constraints: Unlike typical trial versions, the Free Edition does not impose a time limit, allowing organizations to explore the platform at their own pace. This flexibility is essential for CIOs and CDOs who might need extended periods to evaluate the platform’s potential fully.
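
As a concrete example of item 3, the snippet below is the kind of small, throwaway sanity test a team might run in a Free Edition notebook: load a sample file, run a quick aggregation, and fit a trivial model to exercise the full workflow. This is a minimal sketch, not a reference implementation; the file path and column names are placeholders, and spark is the session object a Databricks notebook provides.

```python
# Minimal end-to-end check in a Databricks notebook (Free Edition).
# Assumptions: `spark` is the notebook's SparkSession; the path and columns are placeholders.
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

# 1. Load a small sample dataset (swap in any CSV you have uploaded to the workspace).
df = (spark.read
      .option("header", True)
      .option("inferSchema", True)
      .csv("/Volumes/main/default/samples/orders.csv"))

# 2. A quick aggregation to confirm the data engineering basics behave as expected.
df.groupBy("region").agg({"order_value": "avg", "order_id": "count"}).show()

# 3. A throwaway model to exercise the machine learning side of the workspace.
assembler = VectorAssembler(inputCols=["quantity", "unit_price"], outputCol="features")
train = assembler.transform(df.dropna(subset=["quantity", "unit_price", "order_value"]))
model = LinearRegression(featuresCol="features", labelCol="order_value").fit(train)
print("R^2 on training data:", model.summary.r2)
```

If a test like this runs cleanly on your own sample data, you have a reasonable baseline before scaling the same workflow up on paid capacity.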

Benefits for CIOs and CDOs

  1. Risk-Free Evaluation: The primary advantage of the Free Edition is the risk-free nature of the exploration. CIOs and CDOs can test the platform’s capabilities without signing contracts or making financial commitments, aligning with their careful budget management strategies.
  2. Strategic Insights for Data Strategy: By exploring Databricks firsthand, decision-makers can gain strategic insights into how the platform integrates with existing systems and processes. This understanding is crucial when considering a transition to a new data analytics platform.
  3. Hands-On Experience: Direct interaction with Databricks helps bridge the gap between executive strategy and technical implementation. By experiencing the platform themselves, developers and architects can better champion its adoption across the organization.
  4. Pre-Deployment Testing: The Free Edition enables organizations to test specific use cases and data workflows, helping identify any challenges or concerns before full deployment. This pre-deployment testing ensures that any transition to Databricks is smooth and well-informed.
  5. Benchmarking Against Other Solutions: As organizations evaluate various data platforms, the Free Edition allows Databricks to be benchmarked against other solutions in the market. This comparison can be crucial in making informed decisions that align with long-term strategic goals.

Maximizing the Use of Databricks Free Edition

To maximize the benefits of Databricks Free Edition, CIOs and CDOs should consider the following strategies:

  • Define Use Cases: Before diving into the platform, define specific use cases you want to test. This could include data processing efficiency, machine learning model training, or real-time analytics capabilities. Clear objectives will provide focus and measurable outcomes.
  • Leverage Community Resources: Engage with the Databricks community to explore case studies, tutorials, and shared solutions that can offer fresh perspectives and innovative ideas.
  • Collaborate with Data Teams: Involve your data engineering and science teams early in the evaluation process. Their input and expertise will be invaluable in testing and providing feedback on the platform’s performance.
  • Evaluate Integration Points: During your exploration, assess how well Databricks integrates with existing systems and cloud services within your organization. Seamless integration is vital for minimizing disruption and maximizing workflow efficiency.

Conclusion

The Databricks Free Edition is an invaluable opportunity for CIOs and CDOs to explore the transformative potential of big data analytics on a leading platform without any associated risks.

Contact us to learn more about how to empower your teams with the right tools, processes, and training to unlock Databricks’ full potential across your enterprise.

Streamline Your PIM Strategy: Key Techniques for Effective inriver Integration
https://blogs.perficient.com/2024/11/13/streamline-your-pim-strategy-key-techniques-for-effective-inriver-integration/ | November 14, 2024

In today’s digital landscape, efficiently managing product information is vital for businesses to enhance customer satisfaction and drive sales growth. A robust Product Information Management (PIM) system with excellent integration features, like inriver, will streamline your PIM strategy. By utilizing the integration frameworks and APIs provided by inriver, businesses can ensure relevant, accurate, and consistent product information across all channels. This article explores key inriver integration techniques that have the potential to transform your PIM approach.

The Importance of PIM Integration

Automating PIM processes leads to significant improvements in efficiency, accuracy, and scalability. By eliminating manual data entry, automated integration reduces errors and ensures that information remains consistent and current across all systems. This not only saves time and cuts labor costs but also enhances business agility and customer satisfaction. With automated integration, companies can swiftly adapt to market changes, make informed decisions, and provide timely, personalized information to their customers.

Streamline the PIM process

Exploring inriver Integration Options

There are several ways to automate the integration between the systems used to send or receive data:

Leveraging APIs (application programming interfaces)

  • inriver REST APIs – These can be utilized to build integrations in any programming language and customize interfaces within inriver, including creating enriched PDF/Preview templates (a minimal request sketch follows this list).
  • inriver Remoting APIs – These require C# programming knowledge and are used with hosted solutions. The Remoting API services consist of six major components:
    • Channel Service – Methods related to channels, e.g. retrieving the channel structure, publishing/unpublishing a channel, and retrieving entities and links from a channel.
    • Data Service – One of the most widely used services, for creating, updating, deleting, and finding entities and links in the system.
    • Model Service – Contains methods for building and maintaining the PIM data model.
    • Print Service – Used for developing the inriver print plugin.
    • User Service – Provides methods for maintaining users, roles, permissions, and restrictions.
    • Utility Service – Contains various methods, including those for connector states, HTML templates, languages, and notifications.

Remoting Services

  • Content API – A set of APIs designed to facilitate the onboarding and distribution of large volumes of product data.
    • Content Onboarding API – Helps standardize the data onboarding process by dividing it into five key steps: Landing Area, Field Mapping, Staging Area, PIM Validations, and Import.
    • Content Delivery API – Used for distributing product data to various channels and platforms; it ensures that product data is uniform across all channels.
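
To make the REST option a bit more concrete, here is a minimal sketch of calling the inriver REST API from Python. Treat the host, header name, endpoint path, and entity id as assumptions for illustration only; confirm all of them against your environment's REST API documentation before using this.

```python
import requests

# Assumed values -- replace with your own inriver environment details.
BASE_URL = "https://apieuw.productmarketingcloud.com"  # assumed API host; varies by region
API_KEY = "<your REST API key>"

headers = {
    "X-inRiver-APIKey": API_KEY,   # assumed header name; check your API documentation
    "Content-Type": "application/json",
}

# Illustrative call: fetch summary information for one entity by its id.
# The endpoint path below is an assumption based on the API's general shape.
entity_id = 12345
response = requests.get(
    f"{BASE_URL}/api/v1.0.0/entities/{entity_id}/summary",
    headers=headers,
    timeout=30,
)
response.raise_for_status()
print(response.json())
```

The same general pattern (an API key header plus JSON payloads) typically applies to creating and updating entities, which is how inbound integrations push product data into inriver.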

Integration Framework (IIF) – The Integration Framework is a foundation for building adapters and outbound integrations in inriver. It transforms a customer’s unique data model into a standard integration model. It supports custom entity types and delta functionality, and it provides standard functions to deliver product data.

High level integration framework flow

The following table highlights the key aspects to consider when evaluating integration options within inriver:

| Feature/Aspect | REST API | Remoting API | inriver Integration Framework (IIF) | Content API |
|---|---|---|---|---|
| Functionality | Basic to advanced functionality | Extensive functionality | Outbound integrations | Built on IIF; standardizes inbound and outbound data handling |
| Programming Language | Technology-agnostic | Requires C# programming | Requires C# programming | Technology-agnostic |
| Use Cases | Remote solutions | Hosted solutions, advanced operations | Exporting data to storefronts, building adapters | Onboarding product data, distributing product data |
| Performance | Better performance for remote solutions | Better performance for hosted solutions | Efficient for outbound data handling | Efficient for both inbound and outbound data handling |
| Flexibility | High flexibility, suitable for various platforms | Less flexible, specific to the inriver environment | Moderate flexibility, decouples standard adapters | High flexibility, suitable for various platforms |
| Scalability | Highly scalable | Scalable within the inriver cloud service | Scalable for outbound integrations | Highly scalable |
| Common Applications | eCommerce platforms, CMS, BI tools | ERP systems, custom extensions | eCommerce platforms, marketplaces | Supplier onboarding, ERP, content distribution |

 

These integration techniques can significantly enhance your PIM strategy, ensuring your product data remains accurate, consistent, and up to date across all channels. At Perficient, we engage in comprehensive discussions throughout our elaboration process and continue to validate during the implementation phase. We help finalize best practices tailored to each customer’s unique needs, recognizing that one approach may work better for one client than another. Get in touch to explore how we can support you on your PIM implementation journey, whether you’re starting fresh or facing challenges with an existing system.

Risk Management Data Strategy – Insights from an Inquisitive Overseer
https://blogs.perficient.com/2024/08/19/risk-management-data-strategy/ | August 19, 2024

We are witnessing a sea change in the way data is managed by banks and financial institutions all over the world. Data being commoditized and, in some cases, even monetized by banks is the order of the day, though adoption still needs a push in the risk management function. Traditional risk managers, by the definition of their job, are highly cautious of the result sets provided by analytics teams. I have even heard the phrase “Please check the report, I don’t understand the models and hence trust the number”.

So, in the risk function, while this is a race for data aggregation, structured data, unstructured data, data quality, data granularity, news feeds, and market overviews, it’s also a challenge from an acceptance perspective. The vision is that all of this data can be aggregated, harmonized, and used for better, faster, and more informed decision-making for financial and non-financial risk management. The interdependencies between the risks were factors that were not considered in the “Good Old Days” of risk management (pun intended).

Based on my experience, here are the common issues faced by banks that run the risk of not having a good risk data strategy.

1. The IT-Business tussle (“YOU don’t know what YOU are doing”)

This, in my view, is the biggest challenge facing traditional banks, especially in the risk function. “The Business”, in traditional banks, is treated like a larger-than-life entity that needs to be supported by IT. This notion of IT being the service provider, whilst business is the “bread-earner”, especially in traditional banks’ risk departments, no longer holds good. It has been proven time and again that the two cannot function without each other, and that is what needs to be cultivated as a management mindset for the strategic data management effort as well. This is a culture change, but it is happening slowly and will have to be adopted industry-wide. It has been proven that the financial institutions with the most organized data have a significant market advantage.

2. Data Overload (“Dude! where’s my Insight”)

The primary goal of the data management, sourcing, and aggregation effort has to be converting data into informational insights. The teams analyzing the data warehouses and data lakes and supporting the analytics will have to keep this one major organizational goal in mind. Banks have silos; these silos have been created by mergers, regulations, entities, risk types, Chinese walls, data protection, local laws, or sometimes just technological challenges over time. The solution to most of this is to start with a clean slate. The management mandate for getting the right people to talk and be vested in this change is crucial; challenging, but crucial. Good old analysis techniques and brainstorming sessions for weeding out what is unnecessary and getting to the right set of elements are the key. This needs an overhaul in the way the banking business has traditionally looked at data, i.e. as something that is needed only for reporting. Understanding the data lineage and the touchpoint systems is most crucial.

3. The CDO Dilemma (“To meta or not to meta”)

The CDO’s role in most banks is now well defined. The risk and compliance analytics and reporting division almost solely depends on the CDO function for insights on regulatory reporting and other forms of innovative data analytics. The key success factor of the CDO organization lies in allocating the right set of analysts to the business areas. A CDO analyst on the market risk side, for instance, will have to be well versed in market data, bank hierarchies, VaR calculation engines, Risk not in VaR (RNiV), and the supporting reference data, in addition to the trade systems data that these data elements will have a direct or indirect impact on, not to mention the critical data elements themselves. An additional understanding of how this would impact other forms of risk reporting, like credit risk and non-financial risk, is definitely nice to have. Defining a metadata strategy for the full lineage, its touchpoints, and its transformations is a strenuous effort in analyzing systems owned by disparate teams with siloed implementation patterns built up over time. One fix that I have seen work is for every significant application group/team to have a senior representative for the CDO interaction. Vested stakeholder interest is turning out to be the one major success factor in the programs that have been successful. This ascertains the completeness of the critical data element definitions and hence aids the data governance strategy in a holistic way.
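
As a lightweight illustration of what a critical data element with lineage can look like, the sketch below models one market-risk data element with its golden source, touchpoints, and transformations. All names and systems here are invented for the example; real metadata catalogs have their own schemas, and the point is only to show the kind of information a CDO analyst needs to capture.

```python
from dataclasses import dataclass, field

@dataclass
class CriticalDataElement:
    """One critical data element (CDE) with enough lineage to answer
    'where did this number come from, and what touched it along the way?'"""
    name: str
    owner: str                                                 # accountable CDO analyst / business owner
    source_system: str                                         # golden source
    touchpoints: list[str] = field(default_factory=list)       # systems the element passes through
    transformations: list[str] = field(default_factory=list)   # transformations applied en route

# Invented example: an end-of-day price feeding the VaR calculation engine.
eod_price = CriticalDataElement(
    name="instrument_eod_price",
    owner="Market Risk CDO analyst",
    source_system="MarketDataHub",
    touchpoints=["TradeCapture", "RiskDataLake", "VaREngine"],
    transformations=[
        "currency normalization to USD",
        "stale-price substitution rule",
        "mapping to desk-level book hierarchy",
    ],
)

print(f"{eod_price.name}: {eod_price.source_system} -> " + " -> ".join(eod_price.touchpoints))
```

Multiply this by a few thousand elements across risk types, and the value of a vested senior representative from each application team becomes obvious.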

4. The ever-changing nature of financial risk management (“What did they change now?”)

The Basel Committee recommendations have been consistent in driving the urge to reinvent processes in the risk management area. With the Fundamental Review of the Trading Book (FRTB), the focus has been very clearly realigned to data processes in organizations. Whilst the big banks had already demonstrated a sound understanding of modellable risk factors based on scenarios, this time the Basel Committee has also asked banks to focus on Non-Modellable Risk Factors (NMRF). Add the standardized approach (sensitivities defined by the regulator) and the internal models approach (IMA – bank-defined enhanced sensitivities), and the change from entity-based to desk-based risk calculations is a significant paradigm shift. A single golden-source definition for transaction data, along with desk structure validation, seems to be a major area of concern among banks.
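
To make the modellable vs. non-modellable distinction a little more concrete, below is a toy check in the spirit of the FRTB risk factor eligibility test, which hinges on whether a risk factor has enough real price observations over the past year. This is a simplified sketch for illustration only: the thresholds, gap rule, and sample data are placeholders, and the actual eligibility criteria are defined in the Basel text and in each bank’s implementation.

```python
from datetime import date, timedelta

def is_modellable(observation_dates: list[date],
                  min_observations: int = 24,
                  max_gap_days: int = 90) -> bool:
    """Toy eligibility check: enough real-price observations in the last year,
    with no overly long gap between consecutive observations.
    The thresholds are placeholders -- the real FRTB test is more detailed."""
    one_year_ago = date.today() - timedelta(days=365)
    obs = sorted(d for d in observation_dates if d >= one_year_ago)
    if len(obs) < min_observations:
        return False
    gaps = ((later - earlier).days for earlier, later in zip(obs, obs[1:]))
    return all(gap <= max_gap_days for gap in gaps)

# Placeholder data: roughly one observation every two weeks over the past year.
sample_dates = [date.today() - timedelta(days=14 * i) for i in range(26)]
print("Modellable:", is_modellable(sample_dates))
```

Even a simple check like this makes the data demand obvious: someone has to source, store, and govern a year of observation evidence per risk factor, which is exactly where the data strategy earns its keep.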

Add climate risk to the mix with the Paris Accord, and RWA calculations will now need additional data points, additional models, and additional investment in external data defining the associated physical and transition risks. Data lake / big data solutions with defined critical data elements and a full log of transformations with respect to lineage are a significant investment, but they will only work in favor of any further changes that come through on the regulatory side. There have always been banks that have been consistently great at this and banks that lag significantly.

All in all, risk management happens to be a great use case for a greenfield CDO data strategy implementation, and these hurdles have to be handled on the way to the ultimate Zen goal of a perfect risk data strategy. Believe me, the first step is to get the bank’s consolidated risk data strategy right, and everything else will follow.

 

This is a 2021 article, also published here –  Risk Management Data Strategy – Insights from an Inquisitive Overseer | LinkedIn
