Joe Crabtree, Author at Perficient Blogs
https://blogs.perficient.com/author/jcrabtree/

How Mature is Your DevOps Practice?
https://blogs.perficient.com/2017/02/02/how-mature-is-your-devops-practice/ (Fri, 03 Feb 2017)

We recently published a whitepaper titled “Leveraging the Microsoft Platform for DevOps.” In it, we describe our DevOps maturity model. How mature is your DevOps practice?

Some organizations are so mature that they can blow away and overwrite their production environment up to 1,600 times a day. That's right: they destroy the environment, without incident, on every deployment. It's called immutable continuous delivery, and we label it Bleeding Edge in our maturity model. These organizations run hyper-deployments of their primary business application straight to production, overwriting it every time.
It’s even crazier to think about when you find out the company is Netflix!
Other organizations are using microservices to deploy new code. Azure Service Fabric can perform rolling (round-robin) deployments that keep systems running 24/7/365 with no interruptions. Super high-tech processes and mission-critical systems can benefit from this approach.
And yet, some organizations are stuck with manual processes, no CI/CD, and no automated testing. We call these organizations Archaic. No offense, but if you haven't started automating any deployment processes, you're behind.
Most organizations, of course, fall somewhere in the middle. Maybe you've started to automate your deployment pipelines, but you haven't thought about how that affects your operations teams. Or maybe you've established robust operational procedures, but lack the tooling to automate effectively.
Perficient can help. We have helped numerous organizations across the spectrum with all types of DevOps practices. We can guide you through the process, provide our industry best practices, and implement the tools you need to automate your delivery pipelines. Download the whitepaper from the link above and contact us anytime.

The Future of Azure PaaS – Cloud Dev for the Modern Enterprise
https://blogs.perficient.com/2016/09/30/the-future-of-azure-paas-cloud-dev-for-the-modern-enterprise/ (Fri, 30 Sep 2016)

Welcome from the Microsoft Ignite Conference. It's been a great conference this week. We've had lots of great conversations with customers, partners, and our friends at Microsoft. The conference is huge, both in attendance and in floor space. I've definitely gotten my steps in this week 🙂
It's never been a more exciting time to be a Microsoft developer. With the investments in cloud and open source, and now Microsoft's open-arms approach to competitors, you really can code your application in any language for any platform.
In talking with our customers, it's clear that the full capability of Azure PaaS is the best kept secret in the industry. (I'll assume most of you reading this know what PaaS is. If not, please contact me and I'd be happy to explain further.)
Most customers start their cloud journey with IaaS rather than PaaS. And that's OK: it's the easiest way to envision what the cloud can do for you. You have applications deployed today on web servers in your data center, and building VMs in the cloud allows you to replicate the same functionality and deployment methodologies you use today.
However, you are not getting the most out of the cloud with a strictly IaaS solution. By moving that same app over to PaaS, you can expect a large increase in ROI (up to 400% or more), a 50% improvement in your time-to-market, and savings of up to 80% on your IT time. (These figures come from a Gartner study published in June 2016 and referenced in the Ignite presentation; I couldn't find the direct link.) Gartner did name Microsoft a Leader in the Enterprise Application Platform as a Service space. Notice there is one large name missing from that list: cough, cough, Amazon.
So what are the various PaaS options that Azure provides?
Azure App Service
You've likely heard of this one. App Service is composed of 4 different types of apps: Web Apps (formerly called Websites), Mobile Apps, Logic Apps, and API Apps. The App Service platform supports all of your enterprise-grade custom apps and hybrid scenarios, offers a global data center footprint for geo-replication, is secure and compliant, and integrates with Azure AD. It's also a fully managed platform, providing built-in auto-scale and load balancing, high availability with auto-patching, and backup/recovery. As I mentioned above, it enables highly productive development in .NET, Java, PHP, Node, and Python. In addition, there are really advanced capabilities around deployment slots: you can deploy to a built-in staging slot, use a feature called Testing in Production to route a small percentage of your traffic there to ensure it's stable, and then freely swap between slots in seconds. Extremely powerful stuff for developing modern applications for the cloud!
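To make the slot workflow concrete, here is a minimal sketch using the Azure Management Fluent SDK for .NET. This is my illustration, not anything from the Ignite session: the credentials file, resource group, and app name are hypothetical placeholders, and the fluent API surface has shifted across releases, so treat it as a sketch rather than a definitive implementation.

```csharp
using Microsoft.Azure.Management.Fluent;

class SlotSwapDemo
{
    static void Main()
    {
        // Authenticate with a service principal credentials file
        // ("azureauth.properties" is a placeholder path)
        IAzure azure = Azure.Authenticate("azureauth.properties")
            .WithDefaultSubscription();

        // Look up the web app ("my-rg" and "my-app" are hypothetical names)
        var app = azure.WebApps.GetByResourceGroup("my-rg", "my-app");

        // Promote the staging slot to production; the swap completes in seconds
        app.Swap("staging");
    }
}
```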
Azure Functions
This service was introduced earlier this year at the Build conference. Azure Functions allows you to run server-side code in a serverless environment. Microsoft describes it as a "no ops" solution, meaning you don't have to build, compile, or deploy the code. Rather, you add your code through a GUI in the browser, and the service executes that "function" on an event-driven basis. I initially struggled with how to think about this service. The best use case I have found is Office 365 development. Let's say you want to build a new app part for SharePoint Online using the new SharePoint Framework. The code you write for the app part will be all client-side, hitting the O365 APIs to push and pull data. But you have one small requirement that can only be accomplished using server-side code. This is where Functions can help: your app part's client-side code can call into the Function and get the server-side result. A pretty cool option for those who don't want to build and deploy a full-blown Provider Hosted App just to execute that server-side code.
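For a feel of how little ceremony is involved, here is a sketch along the lines of the portal's default C# HTTP-trigger template; the query parameter and messages are illustrative. You paste this into the browser editor as run.csx, with no build or deploy step:

```csharp
// run.csx, entered directly in the Functions portal editor
using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("Server-side helper invoked from client-side app part code.");

    // Read a value the client-side code passed in ("name" is an illustrative parameter)
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    // Do the work that couldn't be done client-side, then return the result
    return req.CreateResponse(HttpStatusCode.OK, $"Processed on the server for {name}");
}
```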
Azure Service Fabric
Service Fabric is an enterprise-grade platform for microservices, and I sat in on a number of sessions on that topic this week. It's a super interesting solution. It's a VM-hosted platform, meaning you can build a cluster of servers in Azure, on-premises, or in another cloud, so it's not quite fully PaaS. Azure Service Fabric delivers fast in-place upgrades with zero downtime, auto-scaling, integrated health monitoring, and service healing. It also provides orchestration and automation for microservices, which gives new levels of app awareness and insight: live upgrades with rollback, plus automatic scale-up and scale-down. Plus, it solves hard distributed-systems problems such as failover, leader election, and state management, and provides application lifecycle management capabilities so developers don't have to re-architect applications as usage grows.
With these robust offerings, Microsoft is definitely the enterprise-grade choice for Platform as a Service. Contact us at Perficient, and one of our certified Azure consultants can help envision your Azure solution today!

Build 2016: The Future of Microsoft Identity
https://blogs.perficient.com/2016/04/11/build-2016-the-future-of-microsoft-identity/ (Mon, 11 Apr 2016)

At Build 2016 last week, Vittorio Bertocci, Program Manager at Microsoft, presented a session titled "Microsoft Identity: State of the Union and Future Directions": https://channel9.msdn.com/Events/Build/2016/B868
Most identity conversations revolve around forests, domains, and Active Directory administration. This talk is for developers, and Vittorio makes that clear in a very humorous way from the beginning.
Microsoft’s vision statement is “To be the BEST Identity system for ALL developers”, which encompasses three key pillars:

  • Reaches the audience you want
  • Great fundamentals – it must be secure, available, geo-replicated, respect privacy, and have a great user experience
  • Works great with your solution

To help understand the problem, we need to understand what types of identity systems currently exist. They break down into two primary channels, organizations and individuals. Vittorio's slide, summarized below, does a great job of it:

  • Organizations (top to bottom):
    • Cloud Directories
    • Federation-Capable IDPs
    • Active Directory
    • On-Premises Directories
  • Individuals (top to bottom):
    • Local Accounts
    • Emerging IDPs
    • OpenID Connect-Capable IDPs
    • Established Social IDPs

Microsoft cares about all of these various types of providers. For some, a solution exists today; for others, Microsoft is working on a solution. According to Vittorio, they are committed to addressing all of them.
The importance of strong fundamentals…

  • Constantly evolving, enterprise grade security.
  • High scale, availability, performance, disaster recovery, compliance, geo replication
  • Privacy features, locality, sovereign clouds
  • Affordable, pay as you go

Vittorio discussed each of these in detail, but they are really self-explanatory. We all know how important security is; you don't want your identity system compromised.
Finally on the key pillars, what does it mean to “work great with your solution”?
First, the solution must use open protocols. If you go out and create your own protocol that no one knows about, it won't be compatible with any other system. Second, it must have great artifacts for your OS and dev stack, which really means documentation: you must be able to find detailed information about the solution for proper implementation. Next, it must have great management and lifecycle features. Finally, it must have a great user experience.
How do we get there?
On-premises Active Directory has been available since 1999 and is the most widely used business identity solution. AD also has federation capabilities through AD FS, which allows your identity solution to authenticate trusted users from third parties.
What happens when you move your app to the cloud? On-premises AD FS can still support authentication, but it can't tell you anything more about the user you may be interested in; for instance, what groups they belong to or who their manager is. AD FS also has serious scale limitations: each third party must maintain its own AD FS infrastructure, and you must connect to each one individually. That's a problem if you wish to have tens or hundreds of thousands of customers.
How do we solve these problems? Enter Azure Active Directory. Microsoft faced these very same problems when designing Office 365, so they came up with the idea of IDaaS: Identity as a Service. It streamlines the ability to stand up cloud workloads, it uses open authentication and authorization standards, and Azure AD Sync allows you to project your on-premises AD to the cloud.
Azure AD Fundamentals
Microsoft has built 30 data centers around the world, 22 in production and 8 announced, which is more than the next two competitors combined. I heard this stat a number of times at Build. Simply stated, Azure AD is the most available, scalable, and geo-replicated solution on the market, capable of supporting the world's largest enterprises.
Azure AD provides intelligent, ever-evolving security. I first heard about Microsoft's Advanced Threat Analytics last year at the Ignite conference, and it's really advanced technology. If a user logs in from Chicago at 1:38pm and the same user tries to authenticate from Hong Kong at 3:45pm, Azure AD will block the second authentication request: it must be a hijacking attempt, because a user cannot physically get from Chicago to Hong Kong in two hours. Microsoft also has a team that watches black markets where identities are sold. If you lose your identity, Microsoft will notify you that it has been compromised. Very cool!
Azure AD also provides geo-replication and disaster recovery natively. In addition, it provides data sovereignty capabilities for regions such as Germany, where data privacy laws are very strict. Azure AD works on any device or platform, including Android, iOS, Java, Ruby, etc. Finally, Microsoft offers a free tier, which makes the solution ultra-affordable for every use case.
Now what happens if we want to connect to more IDPs or have more control? Enter Azure AD B2C. This solution allows you to white-label an authentication solution. It runs on the same infrastructure as Azure AD, so it is scalable and secure, supports multi-factor authentication, and integrates via open protocols. It allows you to connect local accounts and social providers, includes email verification, and provides a fully customizable user experience. Azure AD B2C is your one-stop-shop solution if you want to accept Microsoft accounts, Google Plus, Facebook, Amazon, or LinkedIn accounts.
What happens if you want to add organizational accounts to Azure AD B2C? You wouldn't want two sets of code, one for organizational (commercial) accounts and one for individual (consumer) accounts. That would be too much work; no one likes supporting two of anything. So Microsoft has just announced the unification of Azure AD and the Microsoft account (MSA) system.
This unification provides 1 registration portal, 1 set of endpoint and protocol conventions, and 1 set of libraries (new and improved)! Microsoft recently went through the same exercise with the Microsoft Graph API; if you are familiar with some of those legacy challenges, you'll get the analogy. Now the same has been done with identity, and Microsoft is providing the new Microsoft Authentication Library (MSAL).
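As a taste of the developer experience, here is a minimal sketch against an early MSAL .NET API. The client ID and scope are hypothetical, and later MSAL releases moved to a builder pattern, so check the current docs before relying on these exact calls:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Identity.Client;

class MsalDemo
{
    static void Main() => RunAsync().GetAwaiter().GetResult();

    static async Task RunAsync()
    {
        // One registration, one library: the same code serves work/school (Azure AD)
        // and personal (MSA) accounts. "client-id" is a hypothetical application ID
        // from the unified registration portal.
        var app = new PublicClientApplication("client-id");

        // The converged endpoint uses scopes rather than resource-based permissions
        AuthenticationResult result = await app.AcquireTokenAsync(new[] { "User.Read" });

        Console.WriteLine(result.AccessToken); // bearer token, usable against Microsoft Graph
    }
}
```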
Great session, Vittorio! I hope you all found something useful from this session. Azure AD and B2C are great emerging technologies that can help fuel your Digital Cloud Transformation.

Build 2016: Azure Lessons Learned from Microsoft Azure CTO
https://blogs.perficient.com/2016/04/01/build-2016-azure-lessons-learned-from-microsoft-azure-cto/ (Fri, 01 Apr 2016)

Mark Russinovich, CTO of Microsoft Azure, hosted a session yesterday at Build titled "Building Resilient Services: Learning Lessons from Azure." It was a great session in which he detailed real-world examples of how Microsoft has failed while deploying new features and functionality to Azure.
Building, upgrading, and maintaining a platform where enterprises deploy mission-critical applications is no easy task. Can you imagine how many different ways a developer could deploy a bug that takes out services worldwide for millions of users? It's kind of a daunting thought.
This session was all about those failures, what went wrong, how they resolved it, and best practices for everyone else to avoid making these same mistakes in the future. Here are my best practice notes from the session:

  • If an application fails, it is likely to flood the error logs with data. Limit logs with a quota.

This example has probably happened to all of us as developers. In Microsoft's case, the flooded log caused severe disk space and memory issues that spread and took the service down. The best practice here makes sense: stop the log from growing so large that it takes down the entire service.

  • Don’t ignore or suppress warnings.

This goes along with the above. If your application is spitting out an error, pay attention to it and get it resolved. You may be filling up your error logs with unnecessary data, or even worse, ignoring a broken piece of functionality that simply hasn't been reported yet. Don't ignore the warnings.

  • Log like everybody is watching.

On a development team as large as Microsoft's, the engineer who wrote the code is almost certainly not the engineer who will fix a problem with it. The first step in fixing a problem is identifying its cause, and you can only do that when the error log has enough detail to lead you to the root cause. The examples Mark described cost their support engineers countless hours as they tried to track down vague errors that didn't point to a specific function or method call. A little extra detail can go a long way.

In the next section, Mark discussed exceptional coding, or handling exceptions. It's very difficult to achieve universal exception coverage; third-party code can often throw an error whose cause you may not be able to determine. One part of the debate is whether you fail fast or catch all. If you let your code fail fast, it can bring down your entire application. Sometimes that's good, because you identify an error quickly and resolve it; other times, that scenario can be disastrous. Conversely, if you catch all exceptions, you may never know there is a persistent error condition. (Visual Studio's guidance says not to catch general exceptions.) Mark went into a bit more detail on which types of exceptions you should catch and which you should not. The best practice here:
  • Log once per hour per exception per machine

This relates to the first best practice as well: limiting how often you log exceptions will greatly reduce your log file quota usage.
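Here is one way that rule could look in practice. This is a hedged sketch of my own, not Microsoft's implementation, throttling output to one entry per exception type per hour on the current machine:

```csharp
using System;
using System.Collections.Concurrent;

// Logs at most one entry per exception type per hour on this machine:
// enough signal to spot a persistent error condition without flooding the quota.
static class ThrottledLogger
{
    private static readonly ConcurrentDictionary<string, DateTime> _lastLogged =
        new ConcurrentDictionary<string, DateTime>();
    private static readonly TimeSpan _interval = TimeSpan.FromHours(1);

    public static void LogException(Exception ex)
    {
        string key = ex.GetType().FullName;
        DateTime now = DateTime.UtcNow; // always UTC, per the session's other best practice
        DateTime last = _lastLogged.GetOrAdd(key, DateTime.MinValue);

        // Only the thread that wins the TryUpdate race writes the entry
        if (now - last >= _interval && _lastLogged.TryUpdate(key, now, last))
        {
            // Include the full exception (stack trace and inner exceptions), so the
            // engineer who didn't write the code can still find the root cause
            Console.Error.WriteLine($"{now:o} [{Environment.MachineName}] {ex}");
        }
    }
}
```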
When investigating errors, an engineer will always examine the error log, and the log is timestamped. Often one error causes another, and you usually get started at the bottom of the error chain; to look at the full chain, you need to match errors sequentially by time. But what happens when your machines aren't logging in the same time zone? Mark showed one example where the first error was logged in Pacific Time while the other errors were logged in UTC. Best practice:

  • Always use UTC when logging errors

Well, this post is getting pretty long, so I won't go into full detail on the rest. You can watch the replay of Mark's session on Channel 9: https://channel9.msdn.com/Events/Build/2016/B863
Other best practices:

  • Store secrets and keys in Azure Key Vault
  • Avoid global services, partition instead
  • Test in a canary environment with production load
  • Bake in production for 24 hours
  • Isolate environments and keys, don’t let them see or talk to each other
  • Isolate endpoints and prevent cascading failures
  • Ensure graceful degradation

I hope this post is helpful for all you Azure Architects out there. Stay tuned to this blog for more great content from Build.

Build 2016: Day 1 Highlights
https://blogs.perficient.com/2016/03/31/build-2016-day-1-highlights/ (Thu, 31 Mar 2016)

Build Conference 2016 is now in day 2. I'm at lunch and finally have a chance to write up my thoughts about day 1.
Lots of cool new stuff is coming down the pipe for developers: biometric identity, pen integration, HoloLens, Cortana improvements, an Xbox dev kit, and much more.
Windows Hello
This is Microsoft's biometric identity framework. In Windows 10, you have the option to set up a fingerprint or iris scan identity; when you log on to your Surface, you get access without providing a password. It's pretty cool technology, something that feels like it has been around for a while but still isn't widely used. During the keynote, they showed a demo of an application that has been extended with Hello, bringing the same biometric login experience to the app. To me, this is the next evolutionary step in security. I dream of a world with no passwords.
Pen
The Surface has been a huge hit (since the Surface Pro 3, at least). I love mine, and it's what I'm using to write this blog. During the conference I've used the pen to take notes in OneNote, and it's amazing; I really like the pen experience. Microsoft has now extended those capabilities to developers: you can build your Universal Windows Apps (UWA) to include a pen taskbar with 2 simple lines of code. The pen experience enables mouse-over events, dynamic highlighting, and a virtual ruler so you can draw straight lines. Big reaction from the crowd on that demo. Microsoft also announced a partnership with Adobe to integrate pen functionality into Adobe Illustrator. Very cool for you graphic designers out there.
HoloLens
Microsoft's vision for virtual reality is better described as augmented reality. Current Samsung-enabled VR devices require a phone that plugs into the headset, and the headset is fully immersive, meaning you have no idea what's happening in the real world around you. The HoloLens instead projects its 3D imagery onto the world around you. The headset is self-contained and does not require a phone, and the glasses look like dark-shaded sunglasses. The demos have all been really cool. NASA has a booth here that I plan on checking out for its Mars exploration experience, which will debut this summer at the Kennedy Space Center. Microsoft needs developers to get behind this technology and partners to start building products around it. While the dev kits started shipping this week, I'm guessing we are a minimum of 2-3 years away from consumers being able to use this technology.
Cortana
Microsoft demoed some new features for Cortana. You can now build bots that interact with Cortana; the demo was a voice-activated Domino's order handled by a bot. I can see some real-world uses for this technology, but I'm not the biggest fan of the personal digital assistant. Call me old-fashioned, but this tech is a little too much tech for me. I prefer to order my pizza using an app, not my voice.
Xbox
I have an Xbox One and I love it, but I've never done any game development. Microsoft announced a new software dev kit that will turn any retail unit into a dev machine. The demo was really cool: it enables real-time debugging by attaching Visual Studio to your Xbox. I'm sure that's huge for the game devs out there.
Lots more to come, stay tuned to this blog for more product announcements and wrap-ups.

Introduction to ASP.NET Core 1.0
https://blogs.perficient.com/2016/02/18/introduction-to-asp-net-core-1-0/ (Fri, 19 Feb 2016)

Also known as ASP.NET 5, ASP.NET Core 1.0 is Microsoft's new web framework, and this introduction will cover some basic topics on it. Microsoft hosted a free all-day training session this week at the MTC, and I was able to attend. It was my first in-depth look at all the new features, and some of them quite surprised me.
The last version of the ASP.NET 4 framework is 4.6. The new ASP.NET Core framework will be backwards compatible and will run on ASP.NET 4.6 libraries. Moving forward, Microsoft will use the "Core" terminology and will discontinue the old numbering system, so there will never be an ASP.NET 5. Don't worry, I'm confused too, smh.
According to Microsoft, the new version represents a significant redesign:

ASP.NET 5 is a new open-source and cross-platform framework for building modern cloud-based Web applications using .NET. We built it from the ground up to provide an optimized development framework for apps that are either deployed to the cloud or run on-premises. It consists of modular components with minimal overhead, so you retain flexibility while constructing your solutions. You can develop and run your ASP.NET 5 applications cross-platform on Windows, Mac and Linux. ASP.NET 5 is fully open source on GitHub.

Note the language: ASP.NET 5 is the same thing as ASP.NET Core 1.0. Microsoft is in the process of updating all documentation, but the framework is still in RC1, so the renaming is a lower priority than finishing development.
The biggest, and most surprising, change for me was the reduced role of NuGet. If you're not familiar with it, NuGet is a package management system for code dependencies on the Microsoft development platform. I was unaware there are competitors in that market who provide a better service.
Moving forward, VS 2015 will use NPM, Bower, Grunt, and Gulp to import dependency packages. The downside of NuGet is that you have to manage an actual package: when the underlying package contents need to be updated, you have to re-package. The newer tools instead keep a pointer to the most recent version, so when the underlying framework is updated, the tool automatically grabs the new version without re-packaging.
There are lots of other new features and changes. Notably, the framework is no longer built on System.Web.dll, moving to a more granular set of optimized packages. Also, Web API is being fully integrated into the new platform, which means you do not need a separate Web API project to create your RESTful services; you will be able to create them in any web application project. Finally, dependency injection support is being added natively, a welcome change for everyone who uses Ninject, Spring.NET, or Unity.
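To illustrate the native DI support, here is a minimal sketch of a Startup class. The service interface and implementation are hypothetical names of my own, and the namespaces shown are the ones that shipped in 1.0 (the RC builds still used the older Microsoft.AspNet.* names):

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

// A hypothetical service to demonstrate registration
public interface IGreetingService
{
    string Greet(string name);
}

public class GreetingService : IGreetingService
{
    public string Greet(string name) => $"Hello, {name}!";
}

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // One aligned stack: MVC serves both Web UI and RESTful Web APIs
        services.AddMvc();

        // Native DI registration; no Ninject, Spring.NET, or Unity required
        services.AddTransient<IGreetingService, GreetingService>();
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseMvc();
    }
}
```

Any controller that declares an `IGreetingService` constructor parameter then gets the registered implementation injected automatically.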
There’s lots more to talk about with the new framework and MVC6. Stay tuned to my blog for more updates!
You can find more details from Microsoft here: https://docs.asp.net/en/latest/conceptual-overview/aspnet.html

  • New light-weight and modular HTTP request pipeline
  • Ability to host on IIS or self-host in your own process
  • Built on .NET Core, which supports true side-by-side app versioning
  • Ships entirely as NuGet packages
  • Integrated support for creating and using NuGet packages
  • Single aligned web stack for Web UI and Web APIs
  • Cloud-ready environment-based configuration
  • Built-in support for dependency injection
  • New tooling that simplifies modern web development
  • Build and run cross-platform ASP.NET apps on Windows, Mac and Linux
  • Open source and community focused


Introducing Microsoft Graph: formerly Office 365 Unified API
https://blogs.perficient.com/2015/12/07/introducing-microsoft-graph-formerly-office-365-unified-api/ (Mon, 07 Dec 2015)

As more and more data moves to the cloud and adoption of cloud SaaS products continues to increase, we need more sophisticated ways to extract, transform, and interact with that data. This spring at the //build conference, Microsoft previewed the “Office 365 Unified API”. That product is now in full production and has been renamed the Microsoft Graph.
If you've been around the community, you may be doing a double take. Wait, don't we already have the Office Graph? Yes, and that's a different product; the Office Graph is the machine learning system that powers Office Delve. Microsoft Graph is a completely different product, although the two appear to be connected. Don't get me started on Microsoft and their naming conventions; I'm sure this will bring tons of confusion when speaking with customers.
What is the Microsoft Graph?
Simply stated, it’s a REST API with a single endpoint. Using the Microsoft Graph, you can turn formerly difficult or complex queries into simple navigations.
The Microsoft Graph gives you:

  • A unified API endpoint for accessing aggregated data from multiple Microsoft cloud services in a single response
  • Seamless navigation between entities and the relationships among them
  • Access to intelligence and insights coming from the Microsoft cloud

You can use the API to access fixed entities like users, groups, mail, messages, calendars, tasks, and notes coming from services like Outlook, OneDrive, Azure Active Directory, Planner, OneNote and others. You can also obtain calculated relationships powered by the Office Graph like the list of users you are working with or the documents trending around you.
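To see what "a single endpoint" means in practice, here is a sketch using plain HttpClient. Token acquisition is omitted, and "<access-token>" is a placeholder you would replace with a real Azure AD access token:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class GraphDemo
{
    static void Main() => RunAsync().GetAwaiter().GetResult();

    static async Task RunAsync()
    {
        using (var client = new HttpClient())
        {
            // "<access-token>" stands in for a token obtained from Azure AD
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", "<access-token>");

            // One endpoint, many services: the profile comes from Azure AD...
            string me = await client.GetStringAsync(
                "https://graph.microsoft.com/v1.0/me");

            // ...while the files come from OneDrive, navigated from the same base URL
            string files = await client.GetStringAsync(
                "https://graph.microsoft.com/v1.0/me/drive/root/children");

            Console.WriteLine(me);
            Console.WriteLine(files);
        }
    }
}
```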
For more information, visit the full release here on the Office Blog:

Unified Microsoft API endpoint for accessing the capabilities of the Microsoft cloud
The Microsoft Graph today exposes APIs, data and intelligence across Office 365 and Azure AD. We are building toward a near future where multiple graphs and all APIs throughout Microsoft contribute to, and are accessible through, a single unified gateway to the power of the Microsoft cloud.
Any developer capable of making an HTTP request can call the API, from any platform, and once-siloed Office 365 services can now be directly navigated via Microsoft Graph. For developers, what used to be 50+ lines of code are now cut to five.
We’re also releasing SDKs to make the Microsoft Graph as useful to developers as possible. We’re starting with .Net, iOS and Android and then expanding to other platforms like Node.js, Python, Java and Ruby. Code samples for a variety of platforms are available on GitHub.
Unified access to rich data living in the Microsoft cloud
You can also think of the Microsoft Graph as the gateway for developers to access the rich data living in the Microsoft cloud.
The opportunities for developers to shape the way the world works are endless. Within the Office 365 surface area alone, consider the amount of data we have with:

  • More than 18 million consumer Office 365 subscribers.
  • 60 million commercial Office 365 monthly active users.
  • More than half a billion people managing their documents and photos in OneDrive.
  • Over 200 million downloads of Office mobile (WXP, Outlook, OneNote on iOS and Android mobile devices).

Unified access to intelligence and insight coming from the Microsoft cloud
The Microsoft Graph is the consistent endpoint for developers to access intelligent insights that Microsoft builds in the cloud.
And because the Microsoft Graph has access to your activities (e.g. documents, calendars, meetings), it can be used to address a ton of critical work and productivity questions, such as:

  • Who does the user work closely with?
  • What documents and topics are important to my colleagues right now?
  • What matters the most to my boss?

With the Microsoft Graph, developers are empowered to build smart, people-centric applications that can easily interact with data from all touchpoints of modern work.


Azure DevOps: Predictability in the Cloud with Azure Automation
https://blogs.perficient.com/2015/12/02/azure-devops-predictability-in-the-cloud-with-azure-automation/ (Wed, 02 Dec 2015)

Why automate? The case should be obvious: beyond efficiency, programmatically carrying out tasks brings consistency, repeatability, and predictability. In DevOps terms, automation is an effective vehicle for minimizing manual intervention from both Dev and Ops in establishing application infrastructure, configuring the runtime, and deploying a target application. That automation provides consistency and predictability of application deployment, with transparency to both Dev and Ops. The theme: DevOps calls for automation, and automation sets DevOps in motion.
Microsoft Azure Automation allows you to automate the creation, monitoring, deployment, and management of cloud resources in your Microsoft Azure subscription using a highly available workflow execution engine. Azure Automation provides an orchestration feature set for public cloud resources similar to what the Service Management Automation (SMA) engine provides for on-premises private cloud resources via the Windows Azure Pack and System Center 2012 R2 Orchestrator. Orchestrator, as the name suggests, is a powerful System Center component for automating and orchestrating a data center; you can consider it a turbo DevOps engine leaning toward the Ops side.
Azure Automation is very effective because it allows you to perform automated cloud provisioning and management without having to manually build and manage a separate set of automation servers. Also, scalability and high availability of the Azure Automation engine are provided natively by the Microsoft Azure cloud platform without any extra configuration steps, which helps ensure that your scheduled runbooks will always execute when needed.
Check out this 30-minute video with Rick Claus, Joe Levy, and (surprise guest) Jeffrey Snover as they explore new features of Azure Automation, part of the Microsoft Operations Management Suite, that have turned it into a powerful, reliable solution for automating IT management tasks on-premises or in any cloud, for Windows and Linux, using the tools you're already familiar with.

Introduction to Microservices using Azure Service Fabric
https://blogs.perficient.com/2015/11/30/introduction-to-microservices-using-azure-service-fabric/ (Mon, 30 Nov 2015)

Today's Internet-scale services are built using microservices. Examples include protocol gateways, user profiles, shopping carts, inventory processing, queues, and caches. More precisely, a microservice:

  • Is (logic + state) independently versioned, scaled, and deployed
  • Has a unique name that can be resolved
  • Interacts with other microservices over well-defined interfaces like REST
  • Remains logically consistent in the presence of failures
  • Is hosted in a container (code + config)
  • Can be written in any language or framework

The Microsoft Azure Service Fabric is a microservices platform that gives every microservice a unique name; a microservice can be either stateless or stateful.
Stateless microservices (e.g., protocol gateways, web proxies) do not maintain any mutable state outside of a request and its response from the service. Azure Cloud Services worker roles are an example of a stateless service. Stateful microservices (e.g., user accounts, databases, devices, shopping carts, queues) maintain mutable, authoritative state beyond the request and its response. Today's Internet-scale applications consist of a combination of stateless and stateful microservices.
Why are stateful microservices important? Why not simply use stateless services for everything? Two reasons:
1) The ability to build high-throughput, low-latency, failure-tolerant OLTP services, such as interactive storefronts, search, Internet of Things (IoT) systems, trading systems, credit card processing and fraud detection systems, and personal record management, by keeping code and data close together on the same machine.
2) Application design simplification: stateful microservices remove the need for the additional queues and caches that have traditionally been required to address the availability and latency requirements of a purely stateless application. Since stateful services are naturally highly available and low-latency, this means fewer moving parts to manage in your application as a whole.
Azure Service Fabric enables you to build and manage scalable and reliable applications composed of microservices running at very high density on a shared pool of machines (commonly referred to as a Service Fabric cluster). It provides a sophisticated runtime for building distributed, scalable stateless and stateful microservices and comprehensive application management capabilities for provisioning, deploying, monitoring, upgrading/patching, and deleting deployed applications.
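As a sketch of what a stateful microservice looks like in code, here is a minimal Reliable Services example. The service name and cart logic are hypothetical illustrations of mine, and the Program.Main registration boilerplate is omitted:

```csharp
using System;
using System.Fabric;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.ServiceFabric.Data;
using Microsoft.ServiceFabric.Data.Collections;
using Microsoft.ServiceFabric.Services.Runtime;

// A hypothetical stateful microservice: the cart data lives inside the service
// as a replicated reliable collection, so no external cache or queue is needed.
internal sealed class ShoppingCartService : StatefulService
{
    public ShoppingCartService(StatefulServiceContext context)
        : base(context)
    {
    }

    protected override async Task RunAsync(CancellationToken cancellationToken)
    {
        var carts = await StateManager
            .GetOrAddAsync<IReliableDictionary<string, int>>("carts");

        while (!cancellationToken.IsCancellationRequested)
        {
            using (ITransaction tx = StateManager.CreateTransaction())
            {
                // State changes are transactional and replicated across the cluster,
                // so the service survives node failures without losing the cart
                await carts.AddOrUpdateAsync(tx, "user-42", 1, (key, count) => count + 1);
                await tx.CommitAsync();
            }

            await Task.Delay(TimeSpan.FromSeconds(30), cancellationToken);
        }
    }
}
```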
Service Fabric powers many Microsoft services today such as Azure SQL Databases, Azure DocumentDB, Cortana, Power BI, Microsoft Intune, Azure Event Hubs, many core Azure Services, and Skype for Business to name a few.
Service Fabric is tailored to creating “born in the cloud” services that can start small, as needed, and grow to massive scale with hundreds or thousands of machines, creating Service Fabric clusters across availability sets in a region or across regions.
Service Fabric provides comprehensive runtime and lifecycle management capabilities to applications composed of these microservices. It hosts microservices inside containers that are deployed and activated across the Service Fabric cluster. Just as moving from VMs to containers makes an order-of-magnitude increase in density possible, a similar order-of-magnitude increase becomes possible when moving from containers to microservices. For example, a single Azure SQL Database cluster, which is built on Service Fabric, comprises hundreds of machines running tens of thousands of containers that host a total of hundreds of thousands of databases (each database is a Service Fabric stateful microservice). The same is true of Event Hubs and the other services mentioned above. This is why the term hyperscale can be used to describe Service Fabric capabilities: if containers give you high density, then microservices give you hyperscale.
Service Fabric provides first class support for the full application lifecycle management (ALM) of cloud applications: from development to deployment, to daily management, to maintenance, and to eventual decommissioning.
The Service Fabric ALM capabilities enable application administrators/IT operators to use simple, low-touch workflows to provision, deploy, patch, and monitor applications. These built-in workflows greatly reduce the burden on IT Operators to keep applications continuously available.
Most applications consist of a combination of stateless and stateful microservices and other EXEs/runtimes that are deployed together. By having strong types on the applications and the packaged microservices, Service Fabric enables the deployment of multiple application instances, each of which can be managed and upgraded independently. Importantly, Service Fabric can deploy any executables or runtimes and make them reliable; for example, it can be used to deploy ASP.NET 5, Node.js, scripts, or anything else that makes up your application.
For more information on Azure Service Fabric and Microservices, contact us at Perficient and one of our 28 certified Azure consultants can help envision your solution today!

Big News – Yammer Deprecating SSO and DirSync!
https://blogs.perficient.com/2015/11/19/big-news-yammer-deprecating-sso-and-dirsync/ (Thu, 19 Nov 2015)

If you are a Yammer customer, you know that identity and directory synchronization for Yammer have been separate from the rest of Office 365. This means you need separate servers and software installations to sync users from your on-premises AD and provide SSO capabilities for Yammer. If you are unsure about this configuration, check out my post from last year that details all the particulars.
Yesterday, Microsoft announced it is finally deprecating the Yammer-specific tools. From the announcement:

Yammer single sign-on (SSO) and directory synchronization (DSync) are legacy tools that Yammer developed before being acquired by Microsoft. As Yammer gets closely integrated with Office 365, we are removing the need for customers to learn and maintain separate tools for Yammer. Instead, customers can use the familiar Office 365 tools to set up single sign-on (Office 365 sign-in with federated identity) and directory synchronization (Azure Active Directory Connect).

This is a huge announcement! Microsoft has finally integrated the identity functions for Yammer into the rest of O365. Here is the schedule:
Yammer SSO and DSync are now deprecated tools. Yammer networks that currently use Yammer SSO will need to transition to using Office 365 sign-in for Yammer. Networks using Yammer DSync will need to transition to using Azure Active Directory synchronization.
Important dates for this deprecation:

  • November 18, 2015: Announcement to deprecate Yammer SSO and Yammer DSync tools.
  • April 1, 2016: Yammer networks will not be allowed to set up new configurations of SSO or DSync.
  • December 1, 2016: Yammer SSO and DSync will stop working.

After December 1st, 2016, if you do nothing, the following changes will take place:

  • Yammer SSO: Yammer single sign-on will stop working. The network will start using Office 365 identity (if the network is associated with an Office 365 tenant) or Yammer identity (username and password sign-in).
  • Yammer DSync: Yammer directory synchronization will stop working. No further changes to your on-premises Active Directory will be reflected in Yammer.

For more information, see the full announcement here.

Innovation with Azure Machine Learning and Dynamics AX
https://blogs.perficient.com/2015/11/16/innovation-with-azure-machine-learning-and-dynamics-ax/ (Mon, 16 Nov 2015)

On a call with Microsoft this morning, they referenced this public case study. I thought it was a really nice example of a company using a multitude of Azure services to innovate its business: Machine Learning, Mobile Services, IoT Hubs, and Dynamics AX. Check out the video and the full description here: https://customers.microsoft.com/Pages/CustomerStory.aspx?recid=12792

Azure DevOps: Scale Out Your Build System
https://blogs.perficient.com/2015/11/13/azure-devops-scale-out-your-build-system/ (Fri, 13 Nov 2015)

Every developer knows that builds are an integral piece of the application lifecycle. Using an automated build and testing process will help speed your application's time to market. Visual Studio and Team Foundation Server offer a number of features to help with this process.
To use Team Foundation Build for automated building and testing of your app, you must first set up a build server, add a build controller and a few build agents, and finally designate a drop folder. If you have a small start-up team working on a new project, you can probably deploy all these build system components on a single computer in a few minutes. As your team and your code base grow, you can expand your build system incrementally, with relative ease.
If you work on a small team with an on-premises Team Foundation Server, consider this topology:
(Diagram: the build controller runs on the application tier, while the build agents run on a dedicated build server.)
Because build agents perform the processor-intensive work on a separate machine, they do not affect the performance of the application tier when builds are run.
You could also run the build controller on the dedicated build server. However, the topology in the illustration has the advantage of making build system changes less disruptive, such as when you must repair or replace the build server.
As the size of your team and your code base increases, you can incrementally add resources to meet your requirements. For example, you could add an additional controller and build agents.
(Diagram: an expanded topology with an additional build controller and more build agents across multiple build servers; Build Controller A remains on the application tier.)
The presence of Build Controller A on the same machine as the application tier is generally not a problem from a processor standpoint. However, you might move the build controller to another server because of the memory pressure and attack surface issues mentioned previously.
By using multiple build servers, you can dedicate each server to a different purpose, as described in the following examples:

  • A build server on a high-performance computer dedicated to build agents that process continuous integration or gated check-in builds. The team needs these kinds of builds—especially gated check-in builds—to run quickly so that their work is not held up waiting for a build.
  • A build server dedicated to nightly scheduled BVT builds that require a lot of time to run processes such as large test runs and code analysis.
  • A build server prepared and dedicated to specialized tasks such as building and testing a Windows Store app.
In scenarios such as these, you can apply tags to specialized build agents and then constrain your build definitions to use only build agents with the correct set of tags.
The following build system topology example could support an enterprise-level software effort.
(Diagram: an enterprise-scale topology in which each team project collection has its own build controller and dedicated build servers.)

Each team project collection must have its own build controller, as shown above. Notice how this topology isolates the build servers: team members who work on Team Project Collection A can use only the build agents that Build Controller A controls. This constraint can be useful in situations where you need to tightly control access to more sensitive intellectual property.
Microsoft Azure has introduced a new service called Build and Deployment. This service uses hosted agents and provides a wider range of capabilities, because it will also allow you to deploy software using the Release Management capabilities Microsoft is developing in Visual Studio Online. Authoring Release Management pipelines will be sold separately (per user), and Microsoft will begin previewing this aspect of the service soon. Currently this requires an MSDN subscription (Visual Studio Test Professional with MSDN, MSDN Platforms, or Visual Studio Enterprise with MSDN), because you need to download and install the Release Management client in order to configure release pipelines.
For more information on the new service and pricing, visit here.
If you’d like more information on Azure DevOps, contact us at Perficient and one of our 28 certified Azure consultants can help you envision your complete Azure powered ALM solution.
