Perficient Portal Solutions Blog



Mark Polly

Twitter: markapolly


The Ideal Length of every Tweet, Facebook Post and Headline

I think every writer at one time or another has thought about how long is too long for a post, tweet or headline.

WordPress Title Length

As I typed my headline into WordPress, it was kind enough to tell me that my headline is 59 of 65 characters. I never understood if WordPress thought 65 was the max or the ideal size for my headline.

Reading through Zite (soon to be Flipboard) today, I came across this article by Kevan Lee at Fast Company: The Proven Ideal Length of Every Tweet, Facebook Post, and Headline Online.  Mr. Lee did the research to find out best practices and here is what he came up with.


Tweets

100 Characters.  This is according to Twitter’s best practices.  Twitter found that there was a spike in retweets for tweets between 71 and 100 characters.  So if you want your tweets sent around by others, don’t take up all 140 available characters, and don’t send very short tweets either.

Facebook Posts

Remember seeing a Facebook post that takes up the entire screen?  Unlike Twitter, the ideal Facebook post size is only 40 characters!  What?  You have so much more room to type in Facebook.


Jeff Bullas found that posts of 40 characters received 86% higher engagement. Because Bullas’ sample of 40-character posts was small, Facebook suggests that 80 characters or less is good too. At 80 characters, Bullas found those posts had 66% higher engagement.

Google+ Headline

On Google+, you have a headline and the body of your message.  If people only look at headlines, how long should they be?  How about 60 characters or less!  So maybe WordPress’ suggestion of 65 or less is not so bad.  Why 60 for Google+?  Demian Farnworth found that more than 60 characters will likely split your headline into two lines in Google+.  Mr. Farnworth says that if you can’t get your headline under 60 characters, then make sure your first sentence draws the reader in quickly.


So what about headlines in general?  This blog post has a headline that WordPress suggested I limit to 65 characters or less.  According to Mr. Lee, 6 words is the ideal length.  Rats – his headline and my copy of it are both more than 6 words.  This number comes from research done by KISSmetrics.

What about Blog Post Length?

According to WordPress, my post is 367 words long right now.  Mr. Lee says that my post is too short!  The ideal post length – according to Medium – is 7 minutes.  That is, people pay attention for about 7 minutes; anything longer and they stop reading.  It turns out that 7 minutes is about 1,600 words, or about 4 times the size of this blog post.
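If you want to sanity-check your own copy against these numbers, a quick script will do; this is a minimal sketch where the limit table is my own summary of the figures cited in this post, not an official list:

```python
# Recommended ranges cited in this post: (minimum, maximum) in characters,
# except the "_words" entries, which count words.
LIMITS = {
    "tweet": (71, 100),                  # the retweet "spike" range
    "facebook": (None, 40),              # characters
    "google_plus_headline": (None, 60),  # characters
    "headline_words": (None, 6),         # words
    "blog_post_words": (None, 1600),     # words (~7 minutes of reading)
}

def check_length(kind, text):
    """Return True if `text` falls within the recommended range for `kind`."""
    low, high = LIMITS[kind]
    count = len(text.split()) if kind.endswith("words") else len(text)
    return (low is None or count >= low) and count <= high

print(check_length("headline_words", "The Ideal Length of every Tweet"))  # 6 words -> True
print(check_length("facebook", "x" * 41))                                 # 41 chars -> False
```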

I hope you enjoyed reading this since I kept it under 7 minutes.  Have a look at all the links I included because they all have additional information that you may find interesting.  If you made it this far into my post, tweet a link, post a message on Facebook or leave a comment to let me know what you think about ideal tweet, post and headline size.


5 Key Strategies for Omni-Channel Marketing

IBM recently conducted a webinar titled “5 Key Strategies for Omni-Channel Marketing” in which they discussed the following strategies:

  1. Collect data that helps create customer profiles
  2. Analyze that customer data to find actionable insights
  3. Decide how to allocate your budget across the right channels to reach the right audiences
  4. Manage the interactions with customers across the channels
  5. Optimize your messages and offers, and capture reactions to feed the data collection process

Within each of the overall strategies, IBM did a good job of providing details about how you go about implementing the strategies.  I’ll briefly talk about each strategy below:

To collect data, you have to look at all the sources of data about customers and their behaviors to which you have access. Sometimes you will want to pull this data into a central datamart or data warehouse.  Broadly, these sources can be broken down into:

  • Web Analytics across your digital channels, including paid, owned and earned channels.  You should also be capturing analytics from your mobile apps if you have any.  You can also use attribution models to help you understand the kinds of data to collect.
  • Offline data comes from CRM, catalog inquiries, and call center activity.  We think you should be using online tools for CRM and call centers to capture data about customer interactions automatically.  You might need to convert some of this data into customer profile information. For example, if a customer calls your help line four times in a month, you may count that person as highly engaged versus somebody who called once in the last year.
  • Real-time data includes items such as shopping cart activity, social media content, and intra-session data.
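The help-line example above – four calls in a month counts as highly engaged – is really a rule for deriving a profile attribute from raw event data. As a toy sketch (the thresholds and labels are illustrative, not from the webinar):

```python
from datetime import date, timedelta

def engagement_level(call_dates, today):
    """Derive a profile attribute from raw call-center activity:
    four or more calls in the last 30 days -> highly engaged."""
    month_ago = today - timedelta(days=30)
    recent = [d for d in call_dates if d >= month_ago]
    if len(recent) >= 4:
        return "highly engaged"
    return "engaged" if call_dates else "unknown"

today = date(2014, 4, 1)
calls = [date(2014, 3, 5), date(2014, 3, 12), date(2014, 3, 20), date(2014, 3, 28)]
print(engagement_level(calls, today))  # four calls in the last month -> "highly engaged"
```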

Now that you have collected data (often massive amounts of data), it is time to analyze it by building out a customer profile.  Sometimes this is called a 360 view of the customer.  Data collected has to be converted into meaningful information about the customer to build this profile.  Here is some of the profile information we want to analyze:

  • What behaviors does the customer exhibit?  How frequently do they interact with us? What purchasing stage are they in?
  • What are the demographics we can discern about the customer?  Gender, age, income category, etc.?
  • How about the customer’s technographics? Are they using a mobile device?  What size screen do they use?  Do they use a Mac vs. Windows?
  • Does the customer have preferences in the form of a preferred channel, contact time, etc?
  • What interests does the customer have?

In the analysis phase, we want to discover or uncover insights about our customers.  We can use this information to develop key segments and personas, so we can target our customers with the right messages.

After developing segments and personas, it is time to decide how to allocate your marketing budget to reach the right audience.  Here you want to plan how to deliver a consistent message across the multiple channels you’ve identified in your customer profiles.  At the same time, you may need to plan how to customize your message for the persona you are going after.  Finally, you may need to develop marketing triggers to fire off a message at the right time for your audience.

If you’ve made it this far, you probably realize that you have to manage this data and these processes.  Marketing automation tools are really helpful here as they can automate the process of getting your message at the right time (say cart abandonment) to the right persona or segment.

Undoubtedly, optimizing this whole lifecycle is a major undertaking.  The amount of data being collected about your customers can be overwhelming.  Reacting to market events requires agility.  It takes time to see and understand changes in your customers’ demographics or technographics.  How quick was mobile adoption?

The good news is that major vendors like IBM have been developing marketing automation software over the past several years.  To learn more about these 5 strategies, you can watch a video made from the webinar.

Adobe Summit: Designing an integrated customer profile

Matthew Rawding, a Consulting Manager with Adobe, talked about how to use Adobe Campaign to create an integrated customer profile.  So what is an integrated customer profile?

An integrated customer profile is a main pillar of Adobe Campaign.  Also included in Campaign are targeted segmentation, visual campaign orchestration, cross-channel execution, real time interaction management, and operational reporting.

The goal is to build the most comprehensive view of a customer possible based on the following sets of data.

  • Contact Information
  • Sociodemographic data
  • Computed Data – compute additional attributes based on other data
  • Explicit customer data – data provided by the customer as in a customer preference
  • Implicit customer data – data we gather
  • Scoring attributes

A key message here is: The more you can gather and manage data, the fewer guesses you have to make as a marketer.

Adobe Campaign provides a way to collect and manage all this data about customers.  This allows you to market directly to customers by knowing this much about each customer.

You want to have conversations with customers, not just communications.  This means listening to what the customers are saying through explicit and implicit data.  In Campaign, you can use this data to target specific customers and the specific communication methods they prefer. Workflows allow you to split a campaign across devices, so a person who prefers email gets an email, while a customer who prefers the phone gets a phone call.

Data aggregation is important. You can load all data into one database, or you can map data sources based on common keys, or you can use a hybrid model: collect some data in one place and map to the sources that can’t be collected.
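The common-key mapping approach can be sketched in a few lines; the sources, keys, and field names here are hypothetical:

```python
# Hypothetical data sources keyed on a shared customer id (the common key)
crm = {"c1": {"name": "Ann", "segment": "gold"}}
web = {"c1": {"last_visit": "2014-03-20", "cart_items": 2}, "c2": {"cart_items": 1}}

def merge_profiles(*sources):
    """Map each source into one profile per key instead of copying
    everything into a single database up front."""
    profiles = {}
    for source in sources:
        for key, attrs in source.items():
            profiles.setdefault(key, {}).update(attrs)
    return profiles

profiles = merge_profiles(crm, web)
print(profiles["c1"])  # CRM and web attributes combined under one customer id
```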

Matthew provided a 9-point checklist for building an integrated customer profile.  The first 5 steps are all about planning.

  1. Define Business Need
  2. Identify data sources
  3. Locate Other Sources
  4. Invest in Data Governance
  5. Plan out your Data Ecosystem
  6. Implementation
  7. Automation
  8. Measurement
  9. Maintenance

The process above is iterative. You periodically have to go through iterations of some or all of the steps.

Adobe Campaign has a very recipient-oriented data model. This data includes delivery logs, tracking logs, proposition logs, survey logs, and sales transactions.  There is often a need to include loyalty data, but you can map that data into Campaign.

Campaign also includes a data loading workflow which automates the importation of data.  This sounds like a typical ETL process.

De-duplication is an important and complicated part of the process.  De-duplication rules need to be created and monitored. Campaign uses a score-based matching system to do de-duping.  This allows you to define a base rule followed by scoring rules; scoring rules let you add weights to specific conditions.  Totaled together, the base match and scoring rule results indicate a matched record.  Campaign calls this an Enrichment activity in the workflow.
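A score-based match of this sort might be sketched as follows; the fields, weights, and threshold are made up for illustration and are not Campaign’s actual rules:

```python
def duplicate_score(rec_a, rec_b, weights=None):
    """Score two records: a decisive base rule (exact email match)
    plus weighted matches on secondary fields."""
    weights = weights or {"last_name": 30, "zip": 20, "phone": 50}
    if rec_a.get("email") and rec_a["email"] == rec_b.get("email"):
        return 100  # base rule: exact email match wins outright
    return sum(w for field, w in weights.items()
               if rec_a.get(field) and rec_a.get(field) == rec_b.get(field))

THRESHOLD = 60  # scores at or above this count as a duplicate
a = {"last_name": "Smith", "zip": "44114", "phone": "555-0100"}
b = {"last_name": "Smith", "zip": "44114", "phone": "555-0199"}
print(duplicate_score(a, b) >= THRESHOLD)  # 30 + 20 = 50 < 60 -> False
```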

After you have the de-duplication rules, you can also set Merge rules that tell the system which data fields to keep from which duplicate record.


Adobe Summit: New Video Analytics in Adobe Analytics

Scott Smith presented a session on video analytics.  Analytics regarding video has been rough to achieve.  There have been few standards and the complexity has been high.

Adobe has introduced a Video Heartbeat Tracking library with which they hope to simplify implementation of video analytics.  They also want to introduce stability into the process and help you understand more about your video usage.  This is becoming more and more important as video consumption moves from traditional TV to all of our connected devices.

Challenges we all face:

  • Granularity vs. cost – how many server calls am I going to make to track usage vs. the cost to make all those calls?  Right now companies send tracking data at quartiles of the video.  If a video is short, like 5 minutes, the quartiles give you data back about every minute, which is OK. As videos get longer, the quartiles get longer and you lose information.
  • Different players and player vendors – flash, html 5, third party
  • Measuring across devices and platforms – measuring in set top boxes, AppleTV, Roku, etc.
  • Syndicated content – your videos get syndicated on many different sites
  • Deciding what metrics to track
  • Understanding how video is impacting business (ROI, KPIs, etc.)
  • Video types – live streaming, video on demand

What is a heartbeat?  A heartbeat is a ping that lets us know you are still watching a video.  Adobe Analytics now has a video heartbeat tracking capability.  Data contained in a heartbeat includes environment, event info, asset information, stream information, and user information.  There is also milestone tracking, for which Heartbeat is the new method.

During playback, a heartbeat is sent every 10 seconds.  These go to a pre-aggregation layer.  When the video is complete or abandoned, final time spent and completion metrics are sent to Adobe Analytics.  By going to 10 seconds, you get improved granularity of time spent tracking and you can also get real-time data. 
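A heartbeat payload along the lines described above might look like the following sketch; the field names are illustrative, not Adobe’s actual schema:

```python
import json

def heartbeat(session_id, asset_id, position_sec):
    """Build one 10-second heartbeat ping: who is watching what, and how far in."""
    return json.dumps({
        "session": session_id,   # stream/user session identifier
        "asset": asset_id,       # which video asset is playing
        "position": position_sec,  # playhead position in seconds
        "event": "play",         # event info
    })

# One heartbeat every 10 seconds of playback for a 35-second view
pings = [heartbeat("s1", "promo-video", t) for t in range(0, 35, 10)]
print(len(pings))  # 4 pings: at 0, 10, 20, and 30 seconds
```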

How does this affect ad tracking?  Ads are tracked the same way – through the heartbeat.  Adobe is working with ad servers to ingest metadata from those servers.  New metrics that are available include impressions, duplicate views, bounce rate, average ads per video, and time spent on ads. These metrics help you optimize your ad recipe.  If you have a pre-roll, you can use the analytics to determine the effect of the pre-roll ad time on viewing the content.

How does this help with Standardization?  First the 10 sec heartbeat becomes a standard.  Adobe is defining a set of reserved variables so everyone has access to the same data. There are 7 core video variables and 7 ad variables.  You can then add custom variables on top of these (for which you have to pay additional money).

With these standards, Adobe will also be able to show benchmarks.  By taking out customer specific data, the standardization allows Adobe to compare video analytics across industries, content type, etc.

Adobe is also including real-time reporting for video.  You can pick a few measures and within Adobe Analytics see trends as they happen in real-time.  Later this year, Adobe is going to report on live events and video on demand.

Pricing in the past has been based on the number of server calls.  Adobe is moving to a stream-based pricing model.  So you pay one price for a stream.  In the stream you can track ads as well as content for the same price.  To estimate the cost, all you need is to estimate the number of streams you want to track.

Right now this new Heartbeat is available on the ActionScript, JavaScript, iOS, and Android platforms.  Other platforms are coming later this year.

Adobe Summit: Reimagining Digital Marketing in Financial Services

Christopher Young from Adobe kicked off this session by talking about these top challenges in Financial Services:

  1. Compliance – 84%
  2. Data Security & Privacy – 82%
  3. Difficulty Personalizing – 80%
  4. Fragmented Data – 78%
  5. Incompatible Marketing Solutions – 74%

Three takeaways:

  1. You are only as good as your analytics
  2. To improve the experience, understand your customer
  3. Don’t go it alone, align the right internal partners (legal, compliance, IT, marketing)

Nicole Sturgill from CEB TowerGroup presented results from surveys they’ve conducted.  The first survey showed the top priorities:

1. Develop a single view of the customer through integrated data

  • 80% said this is important
  • only 32% were confident they could do this

2. Deliver a cost-effective & consistent customer experience across all channels

  • 74% viewed this as important
  • but only 23% were confident they could deliver

3. Manage & organize data for actionable business analytics & decision making.

  • 66% feel this is important
  • but only 30% were confident they could get there

The most important analytics functions CEOs are focused on:

  • Risk Management – 50%
  • Improving product or service profitability – 44%
  • New market identification and segmentation – 42%
  • Developing a corporate or business strategy – 40%
  • Forecasting demand – 40%
  • Product or service development – 49%

Farrel Hudzik from Accenture Interactive spoke about what their clients are asking for.  Financial services companies are thinking about a holistic set of problems.  They are concerned with companies like PayPal and Square that take business away, and with disrupters like Bitcoin.


These firms are now starting to think about how they can partner with other companies to provide services to their customers and get better at analytics.

The best companies at digital marketing start at the top and make a concerted effort to pull together siloed organizations and data.

In a quick poll of the audience, the vast majority of people said they are just getting started in their journey to digital transformation.

Deepak Nair from US Bank talked about digital maturity.  He started by talking about how Facebook began as a site targeted at students and quickly morphed into a mega marketing platform.  One reason for this transformation is that Facebook had good data about its users and understood them.

US Bank started their journey with a reporting effort – they needed to collect the right kind of data and educate their people on what is actionable data.

Next, they went to the analyze stage.  They began identifying problems, causes and next steps.  Deepak called this their maturing stage.

In the leading stage, they began predicting with the data.  They started identifying trends and acting on them.

For data quality, Deepak provided these goals:

  • Build a rich analytics platform to combine online and offline data
  • Create a foundational testing and learning mindset
  • Innovate to solve business problems & increase efficiency.

When it comes to marketing,

  • “Your campaigns are only as good as your data/analytics.”
  • The right leadership and organizational alignment is critical

US Bank achieved the following results:

  • Identified new opportunities
  • Empowered their internal business partners
  • Increased the capability of the tools resulting in better ROI
  • Established Quantifiable results and quick wins
  • Able to build a better customer experience

What should you do:

  • Make a plan to get the data you need.  Ask what data you need, where it is, and what you are going to do with it.
  • Unify that data to complete the view of the customer.  Again, prioritize what data you need and how to get it.
  • Build a coalition of the enlightened by educating people on what you are trying to accomplish, and paint a picture of shared success.

Adobe Summit: Top new features in Adobe Experience Manager WCM

Cedric Huesler (@keepthebyte) is a Product Manager at Adobe and works on the web content management aspect of AEM.  Cedric demoed 10 new features coming in AEM 6.0, which was announced yesterday.  A new major version (6.0 vs. 5.6) indicates a major architectural change in the product.  Adobe expects the new architecture to work seamlessly with prior versions while allowing for more scalability.

I’ve included a video of Cedric talking about new features in AEM 6.0.  Cedric demo’d the new features using a tablet to show how they’ve fully enabled the touch interface for the authoring environment.

The first new feature deals with language translation.  This release does a better job of managing translations.  A site contains a language master for the content, and then Live Copy is used to create the translated content for each country.  A new References feature allows you to see where a content item is used and where it has been translated.  The translation workflow goes out to a translation vendor through an API that Adobe has built into AEM.

Adobe Summit: Attribution and Media Mix Algorithms

There is a lot of buzz around attribution.  In this session, Sid Shah from Adobe spoke about attribution and your media mix by looking at the following three items:

  • Choose the right model
  • Attribution does not equal Media Mix
  • Media mix models should be realistic and predictive

For attribution, you can’t rely on the last click.  49% of conversions happen after multiple clicks, so you must analyze the entire path to be predictive. There are many different heuristic attribution models:

  • Last Click
  • First Click
  • Equal
  • Last More
  • First More
  • Custom

Mathematically, all these models are equally right and equally wrong.
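For concreteness, here is a sketch of three of the heuristic models above as credit-splitting rules; the channel names are hypothetical, and a path with repeated channels is not handled:

```python
def attribute(path, conversions, model="last"):
    """Split conversion credit across an ordered click path under
    simple heuristics: last click, first click, or equal credit."""
    n = len(path)
    if model == "last":
        credit = [0.0] * (n - 1) + [1.0]
    elif model == "first":
        credit = [1.0] + [0.0] * (n - 1)
    elif model == "equal":
        credit = [1.0 / n] * n
    else:
        raise ValueError(f"unknown model: {model}")
    # Assumes each channel appears at most once in the path
    return {ch: c * conversions for ch, c in zip(path, credit)}

print(attribute(["display", "search", "email"], 90, "equal"))
print(attribute(["display", "search", "email"], 90, "last"))
```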

Here are a couple of more complex models:

  • Bottoms-Up Attribution – the theory here is that you start with two different methods and see what converts.  For example, you start with a search plus a display impression and measure the conversions. Then you do the same without the impression and see what the conversion rate is.  From here you can calculate probabilities of conversion based on the conversion paths.  Unfortunately, this method is problematic when it comes to scaling up the testing.
  • Shapley Top-Down Attribution – Lloyd Shapley won a Nobel prize for game theory.  Here you start with a goal and see how each marketing/sales effort contributes to the response.  You have to account for extraneous factors like weather, pricing, the economy, and others that may affect the measures.  So you want to see, if you increase one marketing effort by 5%, how that affects the conversion rate.  If all channels and efforts contribute equally, then attribution is a simple redistribution of sales based on the fraction of impressions. But if there is unequal persuasion in the various channels, you can find the attribution of each channel and develop probabilities for the effect of a channel on the overall outcome.
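The Shapley approach can be made concrete with a toy example: each channel’s credit is its average marginal contribution to conversions over all subsets of the other channels. The channels and conversion numbers below are invented:

```python
from itertools import combinations
from math import factorial

def shapley(channels, value):
    """Shapley attribution: each channel gets its average marginal
    contribution to value() across all subsets of the other channels."""
    n = len(channels)
    phi = {}
    for ch in channels:
        others = [c for c in channels if c != ch]
        total = 0.0
        for k in range(n):
            for subset in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                marginal = value(set(subset) | {ch}) - value(set(subset))
                total += weight * marginal
        phi[ch] = total
    return phi

# Toy conversions per channel combination (invented numbers)
v = {frozenset(): 0, frozenset({"search"}): 60,
     frozenset({"display"}): 20, frozenset({"search", "display"}): 100}
credit = shapley(["search", "display"], lambda s: v[frozenset(s)])
print(credit)  # search gets 70.0, display gets 30.0; together they sum to 100
```

Note how the credits sum to the total conversions of the full channel mix, which is exactly the redistribution property described above.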

Note that attribution looks back in time and does not provide a future view.  This is where the media mix comes into play.  Most marketing response usually follows a power-law model.  However, the models can become complex very quickly because there are multiple marketing inputs, macro influences, and multiple outcomes that have to be part of the model.  Once you have a good model, you can start to use Shapley attribution to gain insights with which to do media mix planning.

Sid presented a case study that showed several media channels where email and display were most effective at that point in time.  After going through the attribution process to see where channels affected leads and revenue, we find that display is contributing more to leads and less to revenue.  PR was attributed a small impact on leads, but had a large impact on revenue.  This data helps us see where we gained good outputs based on historical data.  But Sid warns not to use this data for future planning.

Using the insights, we can optimize the marketing spend in flight on the media mix that contributes most to our goals. However, the optimum mix may not be possible due to other constraints.  In the example, the optimization process said to increase spend on PR by 70%, but the company could not hire enough PR people quickly enough to make this optimization possible.

Adobe has a Market Mix Planner product that will help in the planning process.




Adobe Summit: Understanding customer insight without getting creepy

In a different session, one of the speakers said there is a fine line between understanding customer sentiment and being creepy. To get insight, you have to monitor what customers say and what they do on your site and on others’. But you don’t want to go too far and cross that creepy line.

In this session Craig Stoe and Andrew Bolander spoke about this topic. You can use Adobe Social to gain customer insight. These insights should lead to developing stronger relationships with customers and delivering more consistent experiences.


What is social data?
- Engagement data is what is going on with the customer
- Listening data is brand-relevant keywords, sentiment, and influencers
- Attribution data is what we can learn about the customer’s personal attributes

Start with discovery. This is where you mine data about customers. Next is explore. Here you start to look for more engagement data. At the buy stage, customers express buying signals and you need to pick up on these to make offers. All this data can lead to better engagement.

A social profile helps you collect all this data so you have a better view of the customer. Adobe has developed a scoring algorithm that can be applied to each user. It measures User Class, Supporter Index, Buzz, and User Distribution. These measures lead to a composite score that can be used to target customers.
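The session did not publish the scoring formula, but a weighted composite of the four measures named above might be sketched like this; the weights and 0-100 scale are entirely made up:

```python
def composite_score(user_class, supporter_index, buzz, user_distribution,
                    weights=(0.3, 0.3, 0.2, 0.2)):
    """Combine the four measures (each assumed normalized to 0-100)
    into a single targeting score.  Weights are illustrative only."""
    parts = (user_class, supporter_index, buzz, user_distribution)
    return sum(w * p for w, p in zip(weights, parts))

print(composite_score(80, 60, 50, 40))  # 0.3*80 + 0.3*60 + 0.2*50 + 0.2*40 = 60.0
```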

Andrew presented 5 steps to implement and use the social profile.
1. Listen
2. Prioritize
3. Engage properly
4. Gather insights
5. Share insights

He presented a live demo of Adobe Social and how to build and use the social profile through tweets from the audience.

For listening, you create listening rules that help you filter across the social platforms, hashtags, and attributes.

In prioritization, you can set up automation rules to route content around the organization. You can also apply segments and tags to results of the rules.

When it comes time to engage, you use Unified Moderation, which places multiple platforms on one screen so you can moderate multiple systems at the same time. Moderators can claim content, retweet, respond, and close out an item as handled. By claiming content, you prevent others from working on the same item and ending up with duplicate responses.

The social profile is connected to the moderation screen, and you can review a user’s profile attributes. The system can also pull different accounts together into one profile, so one profile could link your Twitter data with your Facebook data.

The social profile provides the customer insights, including emotion scores, sentiment scores, tracking of previous interactions and more.

As a final piece, Andrew showed how Adobe Social integrates and shares with the Adobe Marketing Cloud. For example, if a user logs into your AEM site using a Facebook login, their actions on the site can contribute to their social profile.

Adobe Summit: Top new DAM features for Adobe Experience Manager

By the size of the line to get into this session, I’d say that Digital Asset Management is a hot topic. Elliot Sedegah and Greg Klebus from Adobe shared the top new DAM features in AEM.


First, DAM in Adobe is now called Adobe Experience Manager Assets.

Managing digital assets can be a tedious process and the time lag involved delays web site launch. Too often assets are not integrated into social media channels consistently.

Assets range from the Creative Cloud to the Marketing Cloud, so from creating assets to managing and sharing assets to multichannel delivery.

How can we shorten the time to delivery? In the Marketing Cloud, we now have Projects that allows us to manage the life cycle of a project. Inside the Project there is a dashboard that shows the assets, tasks, people, experiences, brand guidelines and workflows associated with the project.

AEM and the Marketing Cloud are now connected, so assets in AEM are available in both places. Since the Creative Cloud and Marketing Cloud are connected, you can share assets among AEM, the Marketing Cloud, and the Creative Cloud.

Adobe Summit: Building a social listening program with Adobe Social

I made it to Adobe Summit 2014, and my first session is How to Build a Social Listening Program Using Adobe Social. This is a technical session presented by Carmen Sutter, Adobe’s Product Manager for Adobe Social, and Greg Greenstreet, VP of Engineering at Gnip.

550 million tweets were sent in one day. Yet there is almost no organization to those tweets. You can’t easily go find tweets about your brand, or tweets posted by your employees. Add in Facebook, Instagram, Google+, and others, and you can see that mining social data is not easy. But you must mine this data if you want to understand what people are saying about you.

To make sense of all this data, you want to “cast the right size net for your social campaign”. In other words you want to filter all the data using various techniques. The most basic technique is to use boolean And, Or, and Negation. Gnip uses this kind of filtering as the starting point to filter the huge amount of data coming from the social platforms. Adobe Social then takes that filtered data to further refine the data.

In addition to Boolean filters, you can also filter on geo-tagging. However, only 1-2% of users actually make it easy by supplying geo-encoding when tweeting, so Gnip uses other techniques to derive geo-information.

Using public APIs you can also filter on language, hashtags and many other attributes of social data.
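A first-pass Boolean AND/OR/negation filter of the kind described can be sketched in a few lines; the keyword sets and sample tweets are hypothetical:

```python
def matches(text, all_of=(), any_of=(), none_of=()):
    """Boolean filter: require every term in all_of (AND), at least one
    term in any_of (OR), and no term in none_of (negation)."""
    t = text.lower()
    return (all(k.lower() in t for k in all_of)
            and (not any_of or any(k.lower() in t for k in any_of))
            and not any(k.lower() in t for k in none_of))

tweets = ["Loving the #olympics figure skating!", "olympics of cooking lol"]
kept = [t for t in tweets if matches(t, any_of=("#olympics",), none_of=("cooking",))]
print(kept)  # keeps only the first tweet
```

A platform like Gnip applies this kind of coarse filter to the full firehose, and a tool like Adobe Social then refines the smaller result set further.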

Next, you have to figure out what to do with the filtered data. Look at the goals of what you are trying to accomplish. This will affect the workflows you set up for monitoring and responding.

In the session, the presenters talked about monitoring the Olympics Twitter feed, where one level of filtering had already been performed. Then they looked at specific hashtags, event names, event types, and more.