
Archive for the ‘SQL Server’ Category

On-Premises BI Gets a Boost in SQL Server 2016

At its recent Ignite conference in Chicago, Microsoft unleashed a flood of new information about its products and services across a wide variety of functions. Business Intelligence was not left out, by any means, with announcements of exciting new cloud-based offerings such as Azure SQL Data Warehouse and Azure Data Lake. But given all the focus on Azure and the cloud lately, one has to wonder: what about good ol’ SQL Server? Well, wonder no more.

SQL Server 2016 will include a host of new features related to BI.  In fact, Microsoft claims that SQL Server 2016 will be one of the largest releases in the history of the product.  From hybrid architecture support to advanced analytics, the new capabilities being introduced are wide-ranging and genuinely exciting!

Providing an exhaustive list of new features and enhancements would be, well, exhausting. And the information is currently covered in good detail on the SQL Server product website. But here are a few items that caught my eye from a BI perspective…

For classic SQL Server BI components:

  • SSDT/BIDS will now (finally) be unified in Visual Studio. After the last few years of trying to get VS and SQL Server set up for development across various versions, this is a welcome change.
  • SSAS Multidimensional is getting some attention (finally), with Netezza and Power Query being added as supported data sources.  Also expect some performance improvements, and support for DBCC.
  • SSAS Tabular is also getting some VERY welcome improvements: Power Query as a data source, support for Many-to-Many relationships (hallelujah!), additional new DAX functions, and some cool in-memory scalability enhancements
  • SSIS 2016 will also support Power Query, will integrate with Azure in a number of very useful ways (an Azure Data Factory Data Flow task, for example), and will get some other helpful updates
  • SSRS, after being neglected for several releases, is getting a number of great improvements including additional chart types, cross-browser mobile support, improved parameter functionality, CSS support for custom report themes, and the ability to publish SSRS reports on Power BI sites!
  • Even Master Data Services (MDS) is getting some needed improvements, particularly around performance and security.

And on the Advanced Analytics front:

  • Revolution Analytics R is being integrated directly into the SQL Server relational database.  This will allow developers to access predictive analytics via T-SQL queries, and will support deploying R models as web services in the Azure Marketplace
  • PolyBase, the “secret sauce” in the PDW solution that allows T-SQL querying of both SQL Server and Hadoop data, will be available within SQL Server, without needing an APS appliance
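To make the PolyBase idea concrete: the whole point is that a single query can span relational rows and flat-file data living outside the database. Below is a minimal, hypothetical Python sketch of that federated-query concept, using an in-memory SQLite table to stand in for SQL Server and a CSV string to stand in for files sitting in Hadoop. This is only an illustration of the idea; PolyBase’s actual mechanism uses external tables and pushes work down to the Hadoop cluster.

```python
import csv
import io
import sqlite3

# Relational side: an in-memory SQLite table standing in for SQL Server.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
db.executemany("INSERT INTO customers VALUES (?, ?)",
               [(1, "Contoso"), (2, "Fabrikam")])

# "External" side: delimited text standing in for files in HDFS.
hdfs_file = "customer_id,clicks\n1,42\n2,7\n1,13\n"

# Surface the external data as a table so one SQL statement spans both --
# roughly the transparency that PolyBase's external tables provide.
db.execute("CREATE TEMP TABLE clickstream (customer_id INTEGER, clicks INTEGER)")
reader = csv.DictReader(io.StringIO(hdfs_file))
db.executemany("INSERT INTO clickstream VALUES (?, ?)",
               [(int(r["customer_id"]), int(r["clicks"])) for r in reader])

# One query joins "relational" and "external" data.
rows = db.execute("""
    SELECT c.name, SUM(s.clicks) AS total_clicks
    FROM customers c JOIN clickstream s ON s.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # [('Contoso', 55), ('Fabrikam', 7)]
```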

So, clearly, lots of changes and enhancements are forthcoming in SQL Server 2016.  While Microsoft’s “cloud first, mobile first” initiative has left many on-premises SQL Server users feeling left out, SQL Server 2016 should bring a bright ray of hope.  We should expect to see Microsoft technology developed for cloud make its way into on-premises products going forward, and SQL Server is a perfect example of that trend.

BUILD & IGNITE Know It. Prove It. Tech Challenge Tour


I recently blogged about my personal experiences with the first “Know it. Prove it.” challenge that ran through the month of February 2015. The “Know it. Prove it.” challenge is back! This time it’s bigger and better than ever. The new challenge is a companion to both the Build and Ignite Conferences, with 11 amazing tracks for both Developers and IT Professionals. Also, just like the first round, this set of challenges is completely free!

Join the tour and accept a challenge today.

Whether you’re looking to learn something new or just brush up on something you’re already using, there’s definitely a challenge track for you.


Webinar: Big Data & Microsoft, Key to Your Digital Transformation

Companies undergoing digital transformation are creating organizational change through technologies and systems that enable them to work in ways that keep pace with evolving consumer demands and the state of today’s marketplace. In addition, companies are relying on ever more data to help make business decisions.

And when it comes to consumer data, one challenge is its sheer abundance. How can you turn complex data into business insight? The socially integrated world, the rise of mobile, IoT: this explosion of data can be directed and used, rather than simply managed. That’s why Big Data and advanced analytics are key components of most digital transformation strategies and serve as revolutionary ways of advancing your digital ecosystem.

Where does Microsoft fit into all of this? Recently, Microsoft has extended its data platform into this realm. SQL Server and Excel join up with new PaaS offerings to make up a dynamic and powerful Big Data/advanced analytics tool set. What’s nice about this is that you can leverage tools you already own for your digital transformation.

Join us next week, on Thursday, April 2 at 1 p.m. CT, for a webinar, Transforming Business in a Digital Era with Big Data and Microsoft, to learn why you should be including Big Data and advanced analytics as components of your digital transformation and what options you have when it comes to Microsoft technology.

Hybrid Analytics in Healthcare with O365 & Power BI Webinar Recap

Last week, we held our last Microsoft business intelligence-focused webinar of the year, “Hybrid Analytics in Healthcare: Leveraging Power BI and Office 365 to Make Smarter Business Decisions.” Heidi Rozmiarek, Assistant Director of IT Development for UnityPoint Health, spoke, along with our Microsoft BI team, on implementing an analytics platform in a hybrid environment.

First, the Perficient team covered architectural components and functions; architecture options including on-premises, hybrid, and cloud; and delivery considerations. Next, Steven Gregor, a technical consultant on our Microsoft BI team, reviewed Power BI and its features, including the security model and the client-side Data Management Gateway, and then walked through a live demo.

Last, Heidi shared how her organization is architecting a successful analytics infrastructure using Microsoft technologies. She explained how UnityPoint Health is leveraging Microsoft’s BI stack to provide simple solutions to complex questions. Heidi described how they built the solution, collected and cleansed the data, modeled it, and visualized and reported the answers. She wrapped up by sharing her organization’s plans to move further toward a hybrid on-premises/cloud solution in the next few months.

Power BI Basics Inside Office 365 – A Video Series

Yesterday, we were fortunate to have a customer, Heidi Rozmiarek, Assistant Director of IT Development for UnityPoint Health, speak alongside our Microsoft BI team for the webinar, “Hybrid Analytics in Healthcare: Leveraging Power BI and Office 365 to Make Smarter Business Decisions.”

It was an informative session that began by covering architectural components and functions, and architecture options including on-premises, hybrid, cloud, and delivery considerations. Following this, we had a live Power BI demo, and last but not least, Heidi shared how her organization is using the Microsoft BI stack to provide simple solutions to complex questions. Keep an eye out for a post describing the webinar in more detail, but in the meantime, you can view the replay here.

Whether or not you attended the webinar, if you are interested in learning more about building a hybrid analytics platform with Power BI and Office 365, I highly recommend you take a look at the following short video series.

  1. Introduction to Power BI:  The first video includes an introduction to Power BI, particularly around Power BI Sites, “My Power BI” and the Power BI Admin page.
  2. Administration and Permissions in Power BI: This video focuses on Site Admin and security basics.
  3. Data Exploration and Visualization in Power BI: The third video in the series discusses data exploration and visualization using Excel and related power tools, including Power Pivot and Power View.
  4. Data Management Gateway for Power BI: Here, we cover the steps to enable data feeds in Power BI using the Data Management Gateway.

Anglebrackets Conference – Day 1 Keynote

I’m lucky to be able to attend this year’s Anglebrackets conference in Las Vegas, and I’ll try to cover the conference on this Perficient blog as much as I can. Today was the opening day of the conference, which actually consisted only of the opening keynote. The speaker was Scott Guthrie, Executive VP of the Cloud and Enterprise group at Microsoft. He was wearing his signature red shirt. His keynote was titled, “The Cloud For Modern Business.”

The following are my notes from his keynote:

Mobile first, cloud first.

Why cloud? Cloud enables:

1. Quick and easy deployment.
– No need to wait for provisioning. Demo: a database deployed in Azure in a few clicks; a SharePoint server farm deployed in a few seconds.

2. Elastic capacity.
– no need to buy infrastructure
– unexpected load easily managed
– global coverage with unprecedented scale
Example: the Xbox One game Titanfall is completely cloud-powered. 200,000 VMs were spun up on launch day.

3. Pay only for what you use
– no upfront costs
– no long-term commitment
– no wasted capacity
Example: a slide with a typical website usage pattern (a saw-tooth) illustrating unused capacity. Azure avoids that waste with automatic scale-up and scale-down.

4. Enable new business value
– engage customers with web and mobile
– big data analytics
– machine learning

Microsoft Azure updates for October

Every month, Microsoft releases new Azure services and promotes other services from preview to general availability. In October this year, a few new services were released and a few graduated to general availability.

– Azure Automation is now generally available. Azure Automation is essentially PowerShell scripting in the cloud. Microsoft has long recommended scripting Azure deployment tasks, but previously the scripting capabilities were limited to the developer’s computer. Now, using Azure Automation, it’s possible to run PowerShell scripts in the Azure cloud, create jobs and schedule them for given times, and create automation workflows. These PowerShell workflows are called “runbooks.” Microsoft provides a comprehensive catalog of ready-to-use runbooks made to automate and manage different parts of Azure: web sites, cloud storage, media services, VMs, etc.

– Azure Service Bus received a new feature: Event Hubs. Event Hubs is a hyper-scalable pub/sub event ingestor that can take in millions of telemetry events per second so the data can be processed by Azure cloud services. Event Hubs is designed for use with the “Internet of Things” (IoT): cloud-connected devices with sensors.
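A rough way to picture what an event ingestor does: events are hash-partitioned by a key (such as a device ID) so ingestion scales horizontally across partitions while events from any one device stay in order within their partition. The sketch below is a hypothetical Python illustration of that partitioning idea, not Event Hubs’ actual implementation or API:

```python
import hashlib
from collections import defaultdict

NUM_PARTITIONS = 4  # an event hub spreads load across many such partitions

def partition_for(device_id: str) -> int:
    """Stable hash: one device's events always land in the same partition,
    preserving per-device ordering while load spreads across partitions."""
    digest = hashlib.sha256(device_id.encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_PARTITIONS

# Simulated telemetry from a handful of hypothetical sensors.
events = [{"device": f"sensor-{i % 7}", "reading": i} for i in range(20)]

partitions = defaultdict(list)
for event in events:
    partitions[partition_for(event["device"])].append(event)

# Every event lands in exactly one partition.
assert sum(len(v) for v in partitions.values()) == len(events)
# All events for a given device share one partition.
assert len({partition_for(e["device"]) for e in events
            if e["device"] == "sensor-0"}) == 1
print({p: len(evts) for p, evts in sorted(partitions.items())})
```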

– Microsoft Antimalware for Cloud Services and VMs graduated to general availability. Microsoft Antimalware is a service and SDK enabling protection of cloud services and VMs from malware.

– Instance-level public IPs are now generally available. It’s now possible to directly assign a public IP to a VM or a web or worker role. The previous limit of two public IPs per subscription was removed.

– Elastic Scale preview is now available for Azure SQL Database. Elastic Scale is a set of .NET libraries and management tools that make horizontal scale-out (sharding) of Azure SQL databases easier. Sharding has been a recommended scale-out (and scale-in) pattern for Azure SQL for a while. However, implementing sharding used to require custom coding and management scripts (or manual management of SQL instances). Now it’s much easier.
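The heart of any sharding scheme is a shard map that routes a key to the database holding that key’s data. Here is a minimal, hypothetical Python sketch of a range-based shard map, illustrating the routing that Elastic Scale’s .NET shard-map manager handles for you (the shard names and boundaries are invented for the example):

```python
import bisect

class RangeShardMap:
    """Minimal range-based shard map: each shard owns a half-open key range.
    (A toy stand-in for what a shard-map manager tracks for you.)"""
    def __init__(self, boundaries, shards):
        # boundaries[i] is the inclusive lower bound of shards[i + 1]
        assert len(shards) == len(boundaries) + 1
        self.boundaries = boundaries
        self.shards = shards

    def shard_for(self, key: int) -> str:
        # bisect picks the range containing the key in O(log n)
        return self.shards[bisect.bisect_right(self.boundaries, key)]

# Three shard databases covering customer IDs (-inf, 1000), [1000, 2000), [2000, +inf)
shard_map = RangeShardMap([1000, 2000],
                          ["shard_db_0", "shard_db_1", "shard_db_2"])

assert shard_map.shard_for(17) == "shard_db_0"
assert shard_map.shard_for(1000) == "shard_db_1"
assert shard_map.shard_for(999999) == "shard_db_2"
```

In a real deployment, the application would look up the shard for a key, open a connection to that database, and run its query there; splitting or merging ranges is what makes the scheme “elastic.”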

– Azure Batch is now in public preview. Azure Batch is a new platform that enables users to run large-scale parallel applications on thousands of virtual machines, auto-scale based on the work in the queue, monitor job progress, stage data, and build computing pipelines.

– Stream Analytics is now available in public preview. Stream Analytics is a real-time event-processing engine built to process millions of telemetry events per second when used together with Event Hubs.
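The bread-and-butter operation in this kind of engine is time-windowed aggregation. As a hypothetical illustration (in plain Python rather than a streaming query language), here is a tumbling-window count, where events are grouped into fixed, non-overlapping time windows per device:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, device) events into fixed, non-overlapping windows --
    the kind of aggregation a streaming engine expresses declaratively."""
    counts = defaultdict(int)
    for ts, device in events:
        # Every timestamp maps to exactly one window start.
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, device)] += 1
    return dict(counts)

# Simulated (timestamp_seconds, device_id) telemetry.
events = [(0, "a"), (3, "a"), (4, "b"), (5, "a"), (9, "b"), (10, "a")]
result = tumbling_window_counts(events, window_seconds=5)
print(result)  # {(0, 'a'): 2, (0, 'b'): 1, (5, 'a'): 1, (5, 'b'): 1, (10, 'a'): 1}
```

A production engine does this continuously and incrementally over an unbounded stream, emitting each window’s results as its time boundary passes, rather than batching a finite list as this sketch does.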

– Data Factory is now available in public preview. Azure Data Factory is a framework for creating, managing, and orchestrating data pipelines that connect all kinds of data sources (SQL on-premises or in Azure, Azure Tables or Blobs, etc.) to a Hadoop cluster.
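A data pipeline is essentially a dependency graph of activities that must run in order. This hypothetical Python sketch shows the orchestration idea using topological ordering (the activity names are invented); Data Factory itself declares pipelines as JSON and handles scheduling and data movement for you:

```python
from graphlib import TopologicalSorter

# Each activity maps to the set of upstream activities it depends on,
# mimicking how a pipeline declares its dependency graph.
pipeline = {
    "copy_sql_to_blob":  set(),
    "copy_logs_to_blob": set(),
    "hive_transform":    {"copy_sql_to_blob", "copy_logs_to_blob"},
    "load_warehouse":    {"hive_transform"},
}

# An orchestrator runs activities in an order that respects dependencies;
# independent activities (the two copies) could even run in parallel.
order = list(TopologicalSorter(pipeline).static_order())
print(order)

assert order.index("hive_transform") > order.index("copy_sql_to_blob")
assert order.index("hive_transform") > order.index("copy_logs_to_blob")
assert order[-1] == "load_warehouse"
```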

Why Does Data Warehousing Take So Long? Pt. 3

Last time, I posted about how BI/DW accelerator and framework tools need to be used with care, and not as a substitute for solid requirements analysis. This time, I want to debunk a misconception that can be framed by the following question: Will Agile processes speed up my BI development timeline?  

I see many situations where misconceptions about using Agile methods have gotten people and projects into a tight corner. And it’s not just in database/BI development, but in virtually every type of development. This is just one of them. But why is that? What is the disconnect here?

First, let’s just answer the question: No, Agile methods are probably not going to “speed up” any part of software development.   Sorry. :-/

As to “why,” I think the confusion arises from the fact that, since doing Agile means favoring working product over documentation, you will usually see a “first draft” of part of a solution pretty early on. This “draft” may have some functionality initially, but in most cases, for any sizable solution, it won’t be much. Some screens/UI components might be in place, but it’s most likely that the screens won’t actually do much or take you anywhere. The point is, this “first draft” is reviewed by the business, feedback is gathered, and then it’s iterated upon to build more real functionality. Build working software instead of models and documents, and keep rebuilding and changing until it’s “done.”

And this gets to the core of what Agile methods are really about, which is about how to take in stride a continuously changing set of software requirements. While so-called “BDUF” methodologies lean on mechanisms for Change Control, Agile simply embraces and accepts that such changes are part of the deal. The Agile software development movement began as a response to inevitable requirements changes — not as a way to accelerate development.   The early prototypes you get out of the first sprints in an Agile project do not signal that the project will finish faster. It just means that during development you favor working software more than ancillary process and/or documentation.

So then with respect to Data Warehousing/Business Intelligence projects, if Agile isn’t going to speed things up, what is it actually good for?   Potentially a lot.

For instance, Kimball Method data warehouse development is already iterative, so it can be made to fit fairly well into an Agile context — albeit with some tweaking on matters like sprint length, sprint goals, etc. So this provides an interesting option for managing Kimball method development that may be a good fit for your environment.

Agile report development can also be quite worthwhile. Something I learned early in my consulting career was that it’s a lot easier to get helpful responses from users if you show them something they DON’T want first. Starting with a simple working prototype of a report can be much more fruitful than starting with abstract specs and mockups.

In all, I think that Agile methods are best paired with data work in the following types of scenarios:

  1. Product Development – When you are starting from scratch, and have few or no solid requirements to start with, Agile is your friend. It will give you the opportunity to explore those requirements, stir up some ideas, and still come away with reasonably solid and reusable development progress.   However, whether to stick with Agile beyond initial prototyping is another question — mainly due to the fact that some DW/BI development tasks can be hard to fit into the Agile context for an Enterprise-level solution.
  2. Maintenance Mode – If you have a mature DW/BI solution that is live and in production, Agile can lend itself well to managing maintenance workloads and ongoing changes.   Agile maintenance can give IT departments greater flexibility in responding to sudden priority changes coming from the business.

Are those the only use cases? No, but that’s really a question for another post….   Bottom line: using Agile methods to develop your DW/BI solution does not automatically mean the project will run faster or deliver value sooner. But it can offer some serious non-speed-related benefits — depending on the context of the project.

Next time, I’ll wrap up this series with some ideas for getting value out of a DW/BI project as early as possible. Cheers!

Why Does Data Warehousing Take So Long? Part 2

In my last post, I wrote about BI’s reputation for being a long, drawn-out process. And of course, where the market sees a vacuum… A plethora of products and processes exist to aid with “shortcutting” BI/DW development. So this time, I want to look at some common factors at play when you use the various types of frameworks, accelerators, and off-the-shelf products. I also want to point out how you can easily miss the boat on the long-term benefits of a BI solution by depending too much on these tools.

The market contains some pretty sophisticated enterprise metadata management tools, with full-featured user interfaces that provide drag-and-drop interaction with connected data sources, auto-generate all kinds of artifacts, etc. On that end of the spectrum, you enter the realm of data virtualization and cloud-based solutions, big vendors with huge brand names (e.g., Teradata, Informatica, IBM Cognos), Big Data, and so on. Although this level of tool does significantly more than just generically speed up the BI process, these tools do offer a collection of “accelerator” features. Down the cost scale, other tools in this segment are a bit more “black box”: they will, say, create ETL from scratch by comparing data models (like WhereScape RED), or generate dashboards by just pointing the tool at a set of tables (like Tableau or QlikView). And still others are merely frameworks, either ETL or front-end building blocks, that essentially let you Lego yourself a BI solution (e.g., the growth of BIML for SSIS-based ETL).


Visualization options with Microsoft

I’ve been speaking to a lot of clients lately about the visualization capabilities of the Microsoft BI platform and want to clarify a point of confusion. When building an enterprise analytics platform you will be faced with several decisions around architecture as well as delivery. The architectural options will be vetted by your IT department, but in large part they will be driven by how you want to deliver and consume information in your organization. Typically there will be a balance between ‘traditional’ BI delivery and ‘self-service’ BI delivery.

What’s the difference? Traditional BI delivery comes in the form of reports and dashboards that are built by your IT department with tools such as SSRS or PerformancePoint. Both are solid tools with a lot of functionality. However, most organizations are looking for ways to reduce their dependency on IT-built reports, and therefore need a technology that enables their business users to be self-sufficient. This comes in the form of Excel with Power Pivot and Power View.

A complete explanation of these new tools can be found here.

Feel free to contact us about how these tools can be used in your enterprise to deliver pervasive insights!