Abstract: We live in a time when automating processes is no longer a luxury but a necessity for any team that wants to remain competitive. Automation itself has evolved, too: it is no longer just about executing repetitive tasks, but about creating solutions that understand context, learn over time, and make smarter decisions. In this blog, […]
Databricks Lakebase – Database Branching in Action
What is Databricks Lakebase? Databricks Lakebase is a Postgres OLTP engine integrated into the Databricks Data Intelligence Platform. A database instance is a compute type that provides fully managed storage and compute resources for a Postgres database. Lakebase uses an architecture that separates compute from storage, which allows independent scaling while supporting low latency (<10ms) and […]
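Because Lakebase presents itself as standard Postgres, a plain Postgres driver should be enough to try it out. Below is a minimal, hypothetical sketch using psycopg2; the endpoint, database name, credentials, and table are placeholders, not details from the post, and the exact connection settings would come from your own database instance.

```python
# Hypothetical connection to a Lakebase Postgres instance via a standard driver.
# Host, database, user, password, and table are placeholders for illustration.
import psycopg2

conn = psycopg2.connect(
    host="your-lakebase-instance.example.databricks.com",  # placeholder endpoint
    port=5432,
    dbname="databricks_postgres",      # placeholder database name
    user="your_user",
    password="your_token_or_password",
    sslmode="require",
)

with conn, conn.cursor() as cur:
    # An ordinary OLTP-style query; Lakebase accepts standard Postgres SQL.
    cur.execute(
        "SELECT order_id, status FROM orders WHERE status = %s LIMIT 10;",
        ("open",),
    )
    for row in cur.fetchall():
        print(row)

conn.close()
```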
Celebrating Perficient’s Third Databricks Champion
We’re excited to welcome Bamidele James as Perficient’s newest and third Databricks Champion! His technical expertise, community engagement, advocacy, and mentorship have made a profound impact on the Databricks ecosystem. His Nomination Journey: Bamidele’s path through the nomination process was rigorous. It required evidence that he had successfully delivered multiple Databricks projects, received several certifications, […]
Monitoring Object Creation/Deletion in Cloud Storage with GCP Pub-Sub
When using cloud-based event-driven systems, it’s essential to respond to changes at the storage level, such as when files are added, modified, or deleted. Google Cloud Platform (GCP) makes this easy by enabling Cloud Storage and Pub/Sub to talk to one another directly. This arrangement lets you send out structured real-time alerts whenever something happens […]
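To make the pattern concrete, here is a minimal sketch of the consuming side: a Pub/Sub subscriber reacting to Cloud Storage notification messages. The project and subscription names are placeholders, and it assumes a notification configuration already routes events such as OBJECT_FINALIZE and OBJECT_DELETE from the bucket to the subscribed topic.

```python
# Minimal sketch: listen on a Pub/Sub subscription that receives Cloud Storage
# notification events. Project and subscription IDs below are placeholders.
from google.cloud import pubsub_v1

PROJECT_ID = "my-project"           # placeholder
SUBSCRIPTION_ID = "gcs-events-sub"  # placeholder subscription bound to the bucket's topic

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Cloud Storage notifications carry event details as message attributes.
    event_type = message.attributes.get("eventType")  # e.g. OBJECT_FINALIZE, OBJECT_DELETE
    bucket = message.attributes.get("bucketId")
    obj = message.attributes.get("objectId")
    print(f"{event_type}: gs://{bucket}/{obj}")
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
print("Listening for object events... press Ctrl+C to stop.")
try:
    streaming_pull.result()  # block until cancelled or an error occurs
except KeyboardInterrupt:
    streaming_pull.cancel()
```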
Preparing for the New Normal: Synthetic Identities and Deepfake Threats
What is it like to receive a phone call saying, “I need you to quickly transfer some funds,” from someone who sounds just like your boss but is not your boss? Or to see a complete, realistic video of you stating something you never said? This is not science fiction; this is reality […]
Unlocking Business Success with Databricks One
Business users don’t use notebooks. Full stop. And for that reason, most organizations don’t have business users accessing the Databricks UI. This has always been a fundamental flaw in Databricks’ push to democratize data and AI. This disconnect is almost enshrined in the medallion architecture: Bronze is for system accounts, data scientists with notebooks use […]
Unlocking the Power of MLflow 3.0 in Databricks for GenAI
Databricks recently announced support for MLflow 3.0, which features a range of enhancements that redefine model management for enterprises. Integrated seamlessly into Databricks, MLflow is an open-source platform designed to manage the complete machine learning lifecycle. It provides tools to track experiments, package code into reproducible runs, and share and deploy models. With the launch […]
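To ground those core capabilities, here is a minimal, generic MLflow tracking sketch using the long-standing tracking API (not the 3.0 GenAI features the post goes on to cover); the experiment path, dataset, and model choice are illustrative assumptions.

```python
# Minimal MLflow tracking sketch: log parameters, a metric, and a model artifact.
# Experiment path, data, and model are illustrative, not taken from the post.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

mlflow.set_experiment("/Shared/mlflow-demo")  # placeholder experiment path

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 6}
    model = RandomForestRegressor(**params).fit(X_train, y_train)

    mlflow.log_params(params)
    mlflow.log_metric("mse", mean_squared_error(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")  # packaged for later registration/deployment
```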
Leveraging Model Context Protocol (MCP) for AI Efficiency in Databricks
Model Context Protocol (MCP) is reshaping the way AI agents interface with data and tools, providing a robust framework for standardization and interoperability. As AI continues to permeate business landscapes, MCP offers particular advantages in creating scalable, efficient AI systems. This blog explores what MCP is, its role in the AI landscape, and focuses on […]
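As a small illustration of the standardization MCP provides, here is a sketch of a tool server built with the FastMCP helper from the MCP Python SDK; this assumes the `mcp` package is installed, and the server name and tool logic are made up for the example rather than drawn from the post.

```python
# Sketch of a minimal MCP tool server using the MCP Python SDK's FastMCP helper.
# Server name and tool are illustrative; an MCP-aware agent/client would discover
# and invoke the tool over the protocol.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-metrics-server")  # placeholder server name

@mcp.tool()
def summarize_table(table_name: str) -> str:
    """Return a short, canned summary for a table (stand-in for real logic)."""
    return f"Table {table_name}: 3 columns, 1,204 rows, last updated today."

if __name__ == "__main__":
    # Serve over stdio so an MCP client (e.g. an AI agent host) can connect.
    mcp.run()
```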
Using AI to Compare Retail Product Performance
AI this, AI that. It seems like everyone is trying to shoehorn AI into everything even if it doesn’t make sense. Many of the use cases I come across online are either not a fit for AI or could be easily done without it. However, below I explore a use case that is not only […]
Wire Room Math: AI + SME = (Less Compensation Paid) X (Headline Risk + Payment Errors)^2
At the most basic level, banks take money in and send money out. Sort of like saying NASA launches spaceships and then brings them back. You know it’s more complicated than that elementary statement, but if you had to learn the more advanced functions of banks and NASA, it would likely be […]
Understanding Clean Rooms: A Comparative Analysis Between Databricks and Snowflake
“Clean rooms” have emerged as a pivotal data-sharing innovation, with both Databricks and Snowflake providing enterprise offerings. Clean rooms are secure environments designed to allow multiple parties to collaborate on data analysis without exposing the sensitive details of their data. They serve as a sandbox where participants can perform computations on shared datasets while keeping raw […]
Revisiting Semantic Layers: Lessons from Their Rise and Return
For most of my data-focused career, I’ve been dealing with semantic layers in one way or another, either because the tool I was using to present data required one explicitly, or because the solution itself needed relationships defined in the data to make sense and be better organized. With the recent focus and hype on AI-infused […]