When I saw Chris Penn speak at our 2018 Next10x event, I knew I wanted to get together with him for a video session to discuss machine learning and AI. We did that recently, and the videos below are the result. Each video is accompanied by a transcript if you prefer to read.
We covered seven major topics in our discussions:
What Are Machine Learning and AI?
How Will Machine Learning and AI Change Our World?
Google and AI: RankBrain
Google and AI: Beyond RankBrain
The Rise of Smart Devices
Predictive Analysis and Content
Real Applications of Machine Learning and AI Today
As you can see, we covered a wide range of territory. Throughout these conversations, we’ll give you a solid introductory primer into machine learning and AI, as well as an understanding of how companies and individuals are applying them in the world of search. I hope you enjoy these conversations as much as I did.
Conversation #1: What are Machine Learning and AI?
Eric: Chris, my first question for you is, “Can you just tell the viewers a bit about machine learning and AI, first of all, and how they’re different?” and then, “Where are they today?”
Chris: Gotcha. AI is a big broad umbrella term. It basically means getting computers to do things humans do with our intelligence naturally. So you can see me, and if you’re watching this, you can see what’s going on. You’re using vision. If you’re hearing the words I’m saying and it doesn’t sound just like noise, then you’re using natural language processing.
We learn these things instinctively and through our own training as we grow up. But, we’re trying to teach computers to do those things. Now under that umbrella, the foundation of today’s AI and machine learning is all statistics—it’s all math, right? So, if you didn’t like math, sorry.
The good news is software’s helping with a lot of that. Statistics and probability are really the heart of artificial intelligence. With those individual statistical techniques, we build them into what are called algorithms—repeatable processes. Everybody uses algorithms all day long. When you get dressed in the morning, you probably have a sequence of things that you do every day that is predictable. And so that’s the algorithm.
You get into the interesting territory when you give computers these algorithms, give them data, and say, "Hey, you decide what algorithms to use to make the data reach a conclusion of some kind." That's what we call machine learning: instead of us writing the software and the computer processing the data, we provide the data and the computer writes its own software, then comes out with an outcome.
Now, if you were to take machine learning as a layer of a pancake (a set of algorithms) and you were to start stacking those layers on top of each other like Lego blocks, where the data moves from one block to the next, that's what's called deep learning. It's called deep because it can be hundreds of layers deep. That gets computers much closer to human-level, or beyond human-level, capabilities.
Deep learning is like a stack of machine learning Lego bricks, each brick passing information to the others.
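To make the "we provide the data and the computer derives the rule" idea concrete, here's a deliberately tiny sketch. The "learned program" is just the slope and intercept of a line, fit by ordinary least squares; a real ML system learns far richer functions, but the workflow is the same: data in, rule out.

```python
# Toy illustration: instead of hand-coding the rule, we hand the computer
# example data and let it derive the rule itself. Here the "learned program"
# is just the slope/intercept of a line (not a real ML pipeline).

def fit_line(xs, ys):
    """Learn y = a*x + b from example (x, y) pairs via least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Closed-form least-squares estimates for slope and intercept
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# "Training data": the machine was never told the rule y = 2x + 1
xs = [1, 2, 3, 4, 5]
ys = [3, 5, 7, 9, 11]
a, b = fit_line(xs, ys)
print(round(a, 6), round(b, 6))  # the learned rule: slope 2, intercept 1
```

Stack many such learned transformations on top of each other, each feeding the next, and you have the intuition behind deep learning.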
So, when we’re talking about AI, particularly in terms of marketing and search and things like that, we are talking about computers being able to think like humans, create outcomes that humans want, and optimize for those outcomes. A lot of what we’re going to be talking about today deals with how that impacts things like search.
Eric: Absolutely. One of my favorite examples is when Google's DeepMind subsidiary cracked the code on the game of Go and trained a system to beat the world champion. That was an intense machine-learning exercise right there.
Chris: Yes, absolutely.
END OF PART 1
Conversation #2: How Will Machine Learning and AI Change Our World?
Eric: For the next question, can you talk a little bit about how this is going to impact our world, in terms of the types of jobs and the kinds of things that will change in our environment, overall?
Chris: In the future, there will be two kinds of jobs: you will manage the machines, or the machines will manage you. And that's pretty much the future for everything. If there's a process or task you do that is repetitive, at some point a machine will do it, because it's really not worth a human copying and pasting the same thing over and over again.
In the future there will be just two kinds of jobs: you will manage machines or machines will manage you.
Think about, for example, in the world of search marketing. What are some of the things that we would do in search marketing? We do stuff like keyword scoring, keyword analysis, and text analysis. All that is stuff machines can do. You don’t need a human to do that anymore.
Another thing I think is relevant: have you ever Googled for an Instagram template or an SEO checklist, things like that? If you use a template to do your work today, a machine's going to do it without you tomorrow, because you just don't need to be doing those things anymore. So that's a big part of the future.
And the most important thing, I think, is that from a marketing and communications perspective, marketing becomes truly one-to-one. We can’t scale. You and I can’t individually talk to a million people every day. You and I are having this conversation here. We’re having a one-to-one conversation, but we can’t do this at scale. We don’t scale. There’s just not enough hours in the day.
But an AI can actually do that and have a meaningful interaction with somebody on a one-to-one basis. Whether you’re searching for something or talking to a voice assistant, you can have these interactions one-to-one and the machines can remember who you are.
For one of my favorite examples, go to Google and look up Watson Conversational Ads. It’s an IBM product. Disclosure: I’m an IBM champion, so they send me clothes to wear.
You can talk to the ad. It'll ask something like, "What's your favorite ingredient?" You type in "Sriracha" and it comes up with a recipe, on the fly, that's just for you, based on the time, the weather, your search history, and things like that. It's your recipe, but it also warns you the recipes are not kitchen tested, so use common sense. That's one-to-one marketing, and that's how this is going to impact everything going forward.
END OF PART 2
Conversation #3: Google and AI: RankBrain
Chris: I want to get back to something that we were talking about in terms of search. At the Next10x conference, we were talking a bit about RankBrain and how Google is using AI. What have you seen the big search engines doing with AI and machine learning as it impacts marketers?
Eric: I'm glad you started with RankBrain because there's a bit of a myth out there, and Google picked a really unfortunate name for it. The original RankBrain algorithm is what I call a "sparse data algorithm": it was really about providing better answers for the kinds of search queries users enter that Google has no data for.
So the way that worked is it would actually look at historical search queries, especially on long-tail queries. It might be five, six, seven words long, or even longer queries. Nobody had ever done these queries before, but they could do what they call a similarity vector analysis where they look at the vector for the query entered by the user based on the words.
They might have a similar query where the vector, when they draw it, is really similar. So mathematically, they’re able to determine that these queries are extremely similar. This is building on what you said a moment ago, just doing the statistical analysis.
Looking at those two very similar queries, Google could then actually see how people responded to the other query. Do they not click on the first result? Do they ignore the e-commerce results? Do they click on the informational result? And based on that, they can tune how they give you the results for the query you actually entered. This is where RankBrain started years ago.
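The vector-similarity step Eric describes can be sketched with toy math: represent two queries as word-count vectors and compare them with cosine similarity. RankBrain's real vectors are learned embeddings, not raw word counts, so treat this only as an illustration of how two queries get scored as "extremely similar."

```python
# Toy sketch of query similarity: bag-of-words vectors + cosine similarity.
# (RankBrain uses learned embeddings; this just makes the math concrete.)
import math
from collections import Counter

def cosine_similarity(query_a, query_b):
    va, vb = Counter(query_a.lower().split()), Counter(query_b.lower().split())
    words = set(va) | set(vb)
    dot = sum(va[w] * vb[w] for w in words)
    norm = math.sqrt(sum(c * c for c in va.values())) * \
           math.sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

seen = "best seo firm in framingham massachusetts"   # query with history
new = "best seo company in framingham"               # never-seen query
unrelated = "chocolate chip cookie recipe"
print(cosine_similarity(seen, new) > cosine_similarity(seen, unrelated))  # True
```

When the never-seen query scores close to a known query, the engine can reuse what it learned from the known one.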
The interesting thing is, this got confounded a little bit more because Google made the statement that it was the number-three ranking factor in the Google algorithm. By the way, the first two, they said, were content and links.
Chris: Huge surprise.
Eric: Which is good. The world hasn’t been turned completely upside down yet. The reality is we have to remember, 70% of all search is in the long-tail. So if RankBrain operates primarily in the long-tail, it can actually have a very large impact but not change ranking for higher volume queries at all, which is basically what they tend to say about it.
70% of search takes place in the long tail, and that's where RankBrain comes in.
Chris: But here’s the thing. The way we search is radically changing. So that inflates RankBrain’s importance. Today, when I talk to Google Assistant, I don’t say “best SEO firm.” I don’t speak in these short-clipped phrases. I’ll say, “Hey, Google, what’s the best SEO firm in Framingham, Massachusetts?” Right? It’s a very long-tail query. So does that mean that RankBrain is processing a lot more of the voice interface and the voice searches?
Eric: I think that’s likely the case. As you know, voice queries tend to be much more natural language and much longer, and as a result…Yes, it’s going to trigger RankBrain even more.
END OF PART 3
Conversation #4: Google and AI: Beyond RankBrain
Eric: I happen to think RankBrain is evolving.
Chris: Into what?
Eric: What we're seeing now is this idea of comparative analysis, using machine learning and AI to look at query histories, and that is particularly interesting. It allows them to experiment. Let me replay it briefly: RankBrain was looking at past historical query results and learning from them to tweak your results.
Chris: Based on a vector word analysis.
Eric: Now, let's make a simple modification to that concept and actually run an algorithm where we test certain kinds of listings, see how they perform, and compare them to tests of other kinds of listings. I'm again looking at historical results, but rather than going into the databank and hoping I have a related phrase I've done something with, I'm going out of my way to dynamically test scenarios.
Chris: Well, yes, we know they do that. They do that with Markov chains in the Attribution 360 product. It’s built right in and they do hundreds of millions of comparisons of all your data based on your past data.
Eric: I think they’re being much more deliberative about that now in what they’re doing with search results.
Above is a screenshot of the search results for the phrase "digital cameras" from February of 2018. What you're seeing is that there are two review results and two e-commerce results. By review results, I mean pages giving reviews of lots of different digital cameras.
Fast forward to May of 2018, and it’s changed dramatically. Now we have three e-commerce results, no review page results, and Wikipedia. I’ve seen this for many, many different kinds of SERPs (Search Engine Result Pages) in a way that I’ve never seen in Google before. It’s happening more dynamically. So it’s my conjecture—I have no confirmation, to be fair—but it really looks to me like they’re deliberately testing scenarios to better determine user intent.
It looks like Google is testing user intent assumptions and adjusting search results in response to the tests.
Chris: How do you get around the issue of personalization in the results? When you’re advising clients, do you provide something like, “Here’s the generic, not logged-in result,” and then here’s 12 or 15 personas of standard business users or standard homeowners to show how the results will vary from person to person?
Eric: It’s actually hard to do specific SEO work around personalization. But really, it ultimately all gets back to user intent, and how well your content matches up with user intent. This is something that I think a lot of businesses are dramatically under-invested in because when someone comes to your web page, they’re looking for something. And it might not be just the top-level product on your web page, but all the ancillary needs that they have related to that.
Chris: Do I need a digital camera and…
Eric: Well, I was going to say an SD card, right? I almost said film. I was dating myself terribly there.
Chris: No, you could be retro.
Eric: Well, I could be retro. That would have been an embarrassment. Oh, wait a minute. I did say it. Yes, you have other needs, and you have other things that you’re looking for. So you have to design your content to meet that broader range of needs.
And this, I actually think, is the thing that helps the personalization part of the algorithm work in your favor, because if you’re creating the content that they engage with initially, because you do a good job of putting out there that you’re addressing a broad range of needs, then you’re putting yourself in the situation where the personalization algorithms work in your favor.
END OF PART 4
Conversation #5: The Rise of Smart Devices
Chris: Now, let’s talk a bit about some of the smart devices like Google Home and Alexa and others in that world. How should we be optimizing for these devices, for these much longer tail searches?
Intent is a big part of it because obviously, if I don't have to think about what I'm typing, I would say, "Hey, what's the best SEO firm in Framingham, Massachusetts that accepts B2B clients?" That's a very long search term, and there's a lot more rich intent in there than in "best SEO firm," where the intent is unclear.
So how do we optimize to take advantage of all these different types of intents that people are going to physically speak into their smart home devices, their watches, even people talking to their refrigerators now?
Eric: Absolutely—my car, right?
And my watch; I've got all those devices. I think one of the big things people have to realize is when you're dealing with Alexa or the Google Assistant running on Google Home or something running on your smartphone, and you use a voice query and you get a voice response, you get one answer. You know this, right? This is the big thing.
The great majority of the time, when it's a Google Assistant answer, they're drawing it from what they serve as featured snippets in the regular search results. So the big thing to do is learn how to generate featured snippets.
But let's back up and look at this from a Google perspective and how they're thinking about it. It used to be that when they served regular search results, if the first answer wasn't perfect but the user got what they wanted in position two or three, that was still a good result for the user. They don't have that opportunity in the voice environment. They only get one answer. I happen to think they're investing an enormous amount in machine learning technology. What's your take on that?
Chris: I think you’re absolutely right. And I think one of the things that marketers, in particular, are neglecting is the data they already have. So we’ve been doing a couple of projects, mining people’s CRM data, like the stuff that people call in or email in: “Hey, I’ve got a problem with this product or service.” If you mine that data and you pull out the way people are talking to you about your stuff on your website, that is rich search content to fulfill intent, right?
Mining data from your CRM and email using machine learning can yield new search-friendly content opportunities.
Because you know when somebody searches for "SD card class 10," what they're really asking is, "How do I get a card that doesn't cause frame rate issues when I'm recording video or setting up a security camera?" So if you mine your CRM data and you've got a whole pile of emails that say, "Hey, I've got jittery video," you can go back and reinfuse your public, search-indexed content with that intent: "My video's stuttering." "Okay, you need a Class 10 card," and things like that.
I don’t see companies doing that. People are sitting on these years or decades of CRM data and they just let it sit out there and just cost money as storage cost and let it be a security risk, as opposed to saying, “Let’s use this to inform search and marketing and communications.”
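One minimal way to start the CRM mining Chris describes is to count the phrases customers actually use in support emails. Real projects would use proper text-mining tooling; this sketch (with made-up emails) just shows the principle using bigram counts.

```python
# Hedged sketch: surface the most common two-word phrases in support emails
# so they can seed search-friendly content. Sample emails are hypothetical.
from collections import Counter
import re

def top_phrases(emails, n=3):
    counts = Counter()
    for email in emails:
        words = re.findall(r"[a-z']+", email.lower())
        # Count adjacent word pairs (bigrams)
        counts.update(" ".join(pair) for pair in zip(words, words[1:]))
    return [phrase for phrase, _ in counts.most_common(n)]

emails = [
    "My video stuttering problem started when I record clips",
    "Jittery video on my new camera, video stuttering constantly",
    "Why is my video stuttering during playback?",
]
print(top_phrases(emails, 1))  # ['video stuttering'] — a phrase worth targeting
```

The recurring phrase becomes a content brief: a page that answers "my video's stuttering" in the customer's own words.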
END OF PART 5
Conversation #6: Predictive Analytics and Content
Chris: The other thing I don't see people doing, or hardly anyone doing, is predictive analytics. This is a problem that marketing automation software has made worse: people assume that everybody who is qualified to buy is ready to buy all the time. You're the CEO of a company, right? So clearly, you're qualified. You're the decision-maker. So we're just going to assume that you're ready to buy.
Well, no. I mean, if you're a CEO, you have ebbs and flows throughout the year, particularly if you're publicly traded; you have a quarterly calendar to go by. So by using predictive data, especially search data, which is reliable (people ask Google things they would never ask another human being out loud), you get a much better sense of when somebody is likely to act.
I think part of the intent and part of the search results that we’re talking about is that people don’t take into account time. When is somebody searching for an SEO firm? When is somebody searching for a marketing firm? When is somebody searching for a new car? I would be completely surprised if Google did not take into account time in its results.
Predictive analytics can go beyond what people want to when they want it. Effective marketing shows up just when people are ready to buy.
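A bare-bones sketch of bringing time into search data, as discussed above: given monthly volume history for a phrase (the numbers below are invented), find the month demand peaks so content and budget can be scheduled ahead of it.

```python
# Minimal timing sketch: aggregate monthly search volume across years and
# find the peak month. Volumes are hypothetical, not real data.
from collections import defaultdict

def peak_month(history):
    """history: list of (month_number, searches) across one or more years."""
    totals = defaultdict(int)
    for month, searches in history:
        totals[month] += searches
    return max(totals, key=totals.get)

# Two years of made-up volume for a seasonal query: spikes in March
history = [(1, 900), (3, 4000), (6, 700), (1, 950), (3, 4200), (6, 650)]
print(peak_month(history))  # 3 — so plan the campaign for February
```

Real predictive work would model trend and seasonality properly, but even this crude aggregation answers Chris's question: *when* is somebody searching?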
Eric: Yes, I agree and I think people are dramatically under-invested in content. I mentioned this earlier.
Here’s a case study with a search visibility chart pulled from SearchMetrics for a company that we happen to believe has about 15 full-time, knowledgeable content generators putting up over 100 articles a month on their site addressing specific questions and aspects of topics that users have in their market space.
When you look at this, the traffic lift is crazy. They launched in May of 2016 and have already achieved dramatic search visibility by understanding what you were just talking about and investing in answering real user questions.
END OF PART 6
Conversation #7: Real Applications of Machine Learning and AI Today
Eric: Why don’t you talk to us a little bit about how you guys are using machine learning in your business today.
Chris: It’s really three things.
Predictive analytics: when is something likely to happen or what drives something?
Text mining: understanding what’s in the data you already have. There’s so much data you’re sitting on. Please do something with it. Don’t just put it in a digital filing cabinet to rot forever.
Attribution analysis: the same algorithms Google uses, like Markov chains and Monte Carlo simulations, you can run on your laptop. You won't do it at Google scale, but you can do enough for really good attribution analysis and a very clean picture of what's working. What's really surprising is that search traffic and referral traffic are so under-weighted in most people's attribution models, because they just go with last touch, that if you do a full path analysis, I guarantee you'll find you're underinvesting in search.
No matter what company you're with, you're underinvesting in search, given the way devices are going and with social having changed to be all pay-to-play. Whatever your search budget is, just double it, because search is becoming the only way you can be found without spending large quantities of ad dollars.
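The removal-effect idea behind the Markov-chain attribution Chris mentions can be sketched simply: for each channel, estimate how many converting paths would break without it, then assign credit in proportion. A real implementation builds a transition matrix and simulates paths; this toy version (with made-up paths) only shows why last-touch undervalues channels, like search, that appear early and often.

```python
# Simplified "removal effect" attribution sketch, not a full Markov model:
# credit each channel by the share of converting paths it participates in.

def removal_effect_attribution(converting_paths):
    channels = {ch for path in converting_paths for ch in path}
    total = len(converting_paths)
    # Fraction of conversions lost if the channel disappeared entirely
    loss = {ch: sum(ch in path for path in converting_paths) / total
            for ch in channels}
    norm = sum(loss.values())
    return {ch: round(loss[ch] / norm, 3) for ch in channels}

paths = [
    ["search", "social", "email"],   # email gets all last-touch credit
    ["search", "email"],
    ["social", "search"],
]
credit = removal_effect_attribution(paths)
print(credit["search"] > credit["email"])  # True: search touches every path
```

Under last-touch, search would get credit for only one of the three conversions here; the removal effect shows it participates in all of them.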
Chris: Now, at your Next10x Conference, you mentioned that you were actually taking Python courses and such. So what are you guys using in the AI realm?
Eric: Well, to be honest, at the beginning it was just me trying to get my head around it. Being a geek, I have to go down into the detail before I can come back up and get my own sense of the bigger picture. So I basically was just learning machine learning. I took the course from Andrew Ng, who was Chief Scientist at Baidu, and then the one from Geoffrey Hinton of the University of Toronto, who is also directly involved in machine learning at Google.
Where we’ve gone from there with it though is we’re really focusing a lot of energy on understanding how Google is using AI and machine learning. That’s really a big area for us because that actually puts us in a better situation to help our clients with it. And we have also done some dabbling in tools to improve content quality.
In particular, we have something that’s focused on processing user-generated content and automating that to, at this point, just reduce the need for human moderation by 80-90%. It’s a little hard to get to 100% with that.
Chris: Oh, yes, that’s true.
Eric: But if you can cut it down dramatically, then that’s actually a very high-value thing to do.
Chris: I'll say. One other course you should take a look at is Google's machine learning crash course, which is completely free. It uses TensorFlow, as well as Google's hardware and software. So if you want to get it literally from the horse's mouth, I'd encourage anyone to try it out.
END OF PART 7
If you’ve made your way down to this portion of the post, you have a definite interest in Machine Learning and AI. Watch this space for more content along these lines!
Christopher S. Penn is co-founder of Brain Trust Insights, a data analytics company focused on helping you make more money with your data, a co-founder of PodCamp with Chris Brogan, and co-host of the Marketing Over Coffee marketing podcast with John Wall. Learn more about him at his personal site: www.christopherspenn.com
Eric Enge is the founder of the renowned, award-winning digital agency Stone Temple Consulting, and was its CEO until it was acquired by Perficient Digital, where Eric now serves as General Manager. He is the lead co-author of the bestselling The Art of SEO (now in its third edition from O'Reilly Media), a sought-after keynote speaker, and a regular columnist for Search Engine Land. Eric's groundbreaking studies have become industry standards, regularly cited in major publications.
You can’t manage what you don’t measure. But how do you know what your analytics should measure?
In this episode of our popular Here’s Why digital marketing video series, Eric Enge gives you four fundamental principles that should define your approach to analytics.
Don’t miss a single episode of Here’s Why with Mark & Eric. Click the subscribe button below to be notified via email each time a new video is published.
Subscribe to Here’s Why
Eric’s course for O’Reilly Media: Understanding Google Analytics –From Beginner to Advanced in One Day
See all of our Here’s Why Videos | Subscribe to our YouTube Channel
Mark: Eric, we often hear if you can’t measure it, you can’t improve it. So it would seem that analytics would be a critical concern for anyone doing digital marketing since that’s the way we measure.
Eric: So true, Mark, and yet so many marketers fail to take full advantage of the metrics and data that they collect on a regular basis.
Mark: Why do you think that is?
Eric: To me, the failure starts at a philosophical level; that is these marketers don’t have a well-defined approach to how they implement and use their analytics.
Mark: What would a winning analytics philosophy look like?
4 Fundamentals of Effective Analytics
Eric: I think it’s made up of four fundamental principles.
The first is to set your goals on tracking revenue as closely to direct measurement as you possibly can. Ideally, that would be a situation where you can directly attribute each dollar of revenue to a well-defined source or a distinct path through multiple sources. You should build your goals and attribution models in your analytics to get as close to that direct measurement as you can.
Of course, it isn't always possible to get a direct measurement; no analytics program can see everything that contributes to a conversion. That's why it's also important to develop an attribution strategy.
By careful observation over time, you can get to know the relative proportions of conversions that come from each of your channels. Then calculate your average transaction values and use those to calculate an attribution distribution for your conversions that you can’t directly track.
If you can’t get direct measurement, see if you can approximate it. For example, if you know your average value per transaction and you can track total transactions, you can estimate revenue. Or if you know the average value per visitor, you can also estimate revenue that way. Then you could tie that back to channels and other dimensions to see what’s the most impactful one for your revenue.
For analytics, your goal should be to track revenue as directly as possible, and formulate reasonable attribution models where it's not.
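The approximation Eric describes (average transaction value times tracked transactions, broken out by channel) might look like this sketch; all figures are hypothetical placeholders.

```python
# Estimate revenue per channel when direct revenue tracking isn't available:
# tracked transaction counts * average transaction value. Figures are made up.

def estimate_revenue_by_channel(transactions_by_channel, avg_transaction_value):
    return {channel: count * avg_transaction_value
            for channel, count in transactions_by_channel.items()}

transactions = {"organic search": 120, "email": 45, "social": 15}
estimates = estimate_revenue_by_channel(transactions, avg_transaction_value=80.0)
print(estimates["organic search"])  # 9600.0
```

The same shape works for the per-visitor variant: swap in visitor counts and average value per visitor.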
Mark: Okay. So what’s next in your analytics philosophy?
Eric: The second part of a winning analytics philosophy is developing a measurement plan. While revenue is your goal (for most businesses, at least) and therefore the most important thing to measure, there are many stepping stones on the way to that goal, and measuring them can be useful too.
Mark: What would be some elements you’d include in a measurement plan?
Eric: Here’s a measurement plan I created for my complete course on Google Analytics for O’Reilly Media.
Some of the things worth building into a plan might be e-commerce sales; KPIs such as conversion rate, revenue, and average revenue per transaction, per day, or per channel; dimensions such as channels, demographics, locations, device types, and languages; and behavioral metrics such as new versus returning visitors, total sessions, top landing pages, and pages per session.
These are all things that can go into a measurement plan. And what you want to do with your measurement plan is pick the metrics and dimensions that best fit your business goals, but be thoughtful about it upfront.
To do analytics right, you have to have a measurement plan.
Mark: So, what about segmentation?
Eric: Funny you should ask, because my third principle is segment, segment, segment.
Segmentation helps you slice into your data in different ways that can reveal new insights. You should segment by channels, of course, but there are many other things that you can look at also: the location where users are coming from, the language they’re using, and device types that they’re using. Screen size is another area.
There are many different types of segments you can set up. You need to fit them to the needs of your business. You need to think that whole process through. So spend the time to experiment to learn what segments are interesting to you and which provide the most value.
Segmentation can help you figure out how different audiences are responding to your site. With this information, you can then tune your content to your various audiences.
Segmentation can help you figure out how different audiences are responding to your site.
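As a concrete example of slicing by a dimension, here's a minimal sketch that segments sessions by device type and compares conversion rates. Field names and data are hypothetical; real sessions would come from your analytics export or API.

```python
# Segment sessions by any dimension and compare conversion rates per segment.
# Session records and field names below are illustrative only.
from collections import defaultdict

def conversion_rate_by(segment_key, sessions):
    totals, conversions = defaultdict(int), defaultdict(int)
    for s in sessions:
        totals[s[segment_key]] += 1
        conversions[s[segment_key]] += s["converted"]
    return {seg: conversions[seg] / totals[seg] for seg in totals}

sessions = [
    {"device": "mobile", "converted": 0},
    {"device": "mobile", "converted": 1},
    {"device": "desktop", "converted": 1},
    {"device": "desktop", "converted": 1},
]
print(conversion_rate_by("device", sessions))  # mobile 0.5, desktop 1.0
```

Swap `"device"` for `"channel"`, `"language"`, or `"location"` and the same function answers a different segmentation question.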
Mark: And the fourth and final principle?
Eric: Well, that would be not shortchanging your investment in getting your analytics set up right. I can't overemphasize the importance of this one. If you do a bad job setting things up, you're going to get bad data in return, and that will lead to bad decisions for your business.
Mark: What would be some examples of implementation mistakes?
Eric: One would be not using UTM tracking codes on links from email or other channels. Traffic from untagged links will show up looking like direct traffic, which skews that data. If you don't have these set up right, you won't be able to correctly attribute your conversions, which is the key thing.
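The UTM fix is mechanical enough to sketch: append `utm_source`, `utm_medium`, and `utm_campaign` parameters to every campaign link so analytics can attribute the resulting sessions. The parameter values below are examples.

```python
# Tag an outbound campaign link with UTM parameters so sessions from it
# aren't lumped into direct traffic. Values are example placeholders.
from urllib.parse import urlencode

def add_utm(url, source, medium, campaign):
    params = urlencode({"utm_source": source,
                        "utm_medium": medium,
                        "utm_campaign": campaign})
    separator = "&" if "?" in url else "?"
    return f"{url}{separator}{params}"

link = add_utm("https://example.com/offer", "newsletter", "email", "spring_sale")
print(link)
# https://example.com/offer?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
```

In practice you'd generate these with your email or campaign tooling; the point is that every campaign link carries its source before it goes out.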
Another is not setting up cross-domain tracking. For example, let’s say you’re an e-commerce site, and your shopping cart actually is on a shopping service, a payment service of some kind. So the conversion actually happens on a different domain.
If you don't have proper cross-domain tracking in this scenario, and traffic comes in on your regular site but the transaction actually closes on the payment service, those conversions won't be properly attributed to the right channel. So this is a really big problem.
Another big issue is not setting up goals or not setting them up correctly. It’s important to identify real goals to set up such as conversions, obviously. But they also might be a call being initiated or a contact form request of some sort. Without goals, you aren’t able to track all your important conversion points.
And finally, and this one is very common, failing to use annotations to track when changes were made to the site that might affect visits, conversions, analytics, etc. Most analytics programs allow you to add dated annotations, which can be really useful when there's a sudden drastic change in your metrics. An annotation might provide a clue as to what change you made that caused a problem on the site.
Mark: Thanks, Eric. Obviously, analytics is a complex topic. And if you’re like me you’re feeling like you don’t know enough about it. That’s why I want to recommend you check out Eric’s complete course in Google Analytics published by O’Reilly Media.
Cloudflare is one of the most common names when it comes to website protection, anti-DDoS filtering, and content delivery via a distributed network. Over the last decade, a lot of information – and misinformation – has been spread about the network. Some things people have said are false, and others may have been true but have changed in the years since. With so much to contend with, it's no wonder people research the answers before they consider buying in.
I’m going to go over one of the primary concerns with Cloudflare, which is whether or not it can potentially harm your ability to make sales or convert new customers. Before we begin, though, I’d like to talk about what this article is not.
This article is not an analysis of Cloudflare’s effect on SEO. For that, you want to turn to other authorities, at least until I get around to doing my own analysis. This case study is a pretty good one, for example.
I’m going to take a look at more direct issues with Cloudflare. Can it accidentally block legitimate traffic?
Can Cloudflare Block Legitimate Users?
I might as well not dance around it. Yes, Cloudflare can occasionally block legitimate users. However, it’s fairly unusual and it’s not likely to happen in an instance where it matters to a sale.
There are three instances where Cloudflare might be blocking something legitimate. The first is when they have enabled anti-DDoS mode and are aggressively filtering traffic. The second is when they’re blocking tools that legitimate users might want to use. The third is when the user’s own device is unwittingly a source of malicious traffic. I’ll talk about each of these individually next.
The benefits of Cloudflare are strong enough that, even if it’s blocking a few customers, it’s generally worthwhile to use anyway. DDoS protection is just one benefit to their service in general; the CDN can be helpful for other reasons as well.
The first instance where Cloudflare might be blocking legitimate users is when your website is under a DDoS attack and it is filtering traffic aggressively to try to keep your server up and your website alive. It’s an example of killing a few to save the many, essentially.
Think of it like hunting. When the numbers of a certain animal exceed what natural predation and natural habitats can support, human hunting can help thin the herd. This reduces strain on the environment and brings nature back into a more self-sustaining balance. Wild, unrestrained hunting can destroy an animal population, but careful pruning of a species can keep it from driving itself extinct.
To apply the analogy to Cloudflare, think of your audience as the herd and your website as the environment. If the herd grows too large – during a DDoS – it destroys the environment, bringing your site down and killing the entire audience. You can’t make sales if no one can access your site, right? By carefully pruning out some of the traffic, Cloudflare can keep your site up and active. 99% of the traffic that is blocked is bad bot traffic, but some legitimate users may be caught in the net. Sure, you might lose a few sales, but nowhere near as many as you lose if your site is down entirely.
For the individual user, it’s certainly frustrating. How many times have you wanted to browse a site, only to be confronted with the Cloudflare block warning? The service attempts to provide a cached version of the site to browse while blocking is going on, but in my experience it rarely works. If a site protected by Cloudflare is under attack, and you can’t access it right away, it’s probably not going to happen until the attack is over.
The second case where Cloudflare might be blocking legitimate users and legitimate traffic is if those users or that traffic comes from a tool rather than the user directly. I have two specific examples here.
The first example is RSS. RSS readers check your site – specifically your RSS feed – on a regular schedule. Some might be hourly, some once a day, some on other schedules, but they all work the same. Any user who wants to keep up with your blog via RSS is going to be using an RSS reader of some kind to access it.
The problem is when Cloudflare decides that the RSS traffic is too bot-like and blocks it. You run into cases where the RSS reader never picks up new RSS data, for one reason or another. It’s a common enough problem that people have gone out of their way to try to code bypasses and deal with the issue in various ways. Cloudflare is, of course, difficult to bypass for good reason.
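One simple diagnostic, if you suspect your feed fetcher is being filtered, is to check how its requests identify themselves. The sketch below is illustrative only – the feed URL and user-agent string are placeholders, not a Cloudflare-endorsed workaround – but it shows the standard etiquette for feed fetchers: an explicit, descriptive user agent with a contact URL, which looks far less bot-like than a bare library default.

```python
import urllib.request

# Hypothetical feed URL, for illustration only.
FEED_URL = "https://example.com/feed.xml"

def build_feed_request(url: str) -> urllib.request.Request:
    # A descriptive, honest user agent with a contact URL is standard
    # etiquette for feed fetchers, and far less bot-like than the
    # library default ("Python-urllib/3.x").
    return urllib.request.Request(
        url,
        headers={"User-Agent": "ExampleFeedReader/1.0 (+https://example.com/about)"},
    )

req = build_feed_request(FEED_URL)
print(req.get_header("User-agent"))
# → ExampleFeedReader/1.0 (+https://example.com/about)
```

Passing the request to `urllib.request.urlopen()` would then fetch the feed; a 403 or 503 response despite a well-behaved user agent suggests the blocking is happening at the IP or challenge level instead.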
How many people reading your RSS are going to convert into customers? It’s hard to say. Many of them might already be customers, looking to keep up with your blog. Others might eventually convert, but probably not from the RSS directly. Unless you can measure those specific numbers, it’s impossible to tell if you’re losing customers directly.
Another example is TOR. TOR is The Onion Router, a set of layered proxy connections that anonymize web traffic through a hazy cloud of redirects, as a way of adding security to web browsing. Your ISP can’t track your activities if they can’t tie your activities to your computer, right?
TOR has its issues, and I’m not going to dig into them right now. Suffice it to say that in some cases it’s not as anonymous as it claims to be. However, TOR is also often used for less legitimate traffic. There are two issues with this, and they both come down to how Cloudflare blocks traffic.
Cloudflare blocks traffic based on user agent and IP. TOR gives your traffic the identifying information of whatever exit node you happen to be connecting through for that session, so your traffic can come through a node that has been blocked for past abuses. Alternatively, you might have disabled TOR, but if your computer served as a node while it was enabled, your IP may already be blocked.
Cloudflare has attempted to lessen the impact of using TOR, but there’s only so much they can do. If they lift all the blocking and filtering options, that tool will simply be used against them, and it makes it much harder to block specific bots.
Again, though, sacrificing a few users of specific tools in favor of keeping your site alive in the event of malicious traffic is generally a small price to pay. Unless you’re operating in a very narrow niche where 99% of your customers are users of such a tool, it won’t be that big of an impact.
Much like the case where your node in a TOR network is used as a pass-through for malicious traffic, other causes can lead to your IP being blocked. DDoS attacks are often committed through the use of a botnet, and botnets are generally composed of infected computer systems. When a computer gets the right kind of virus, it can be used as a slave to deliver traffic for a variety of reasons.
Tech-savvy users are going to keep their computers generally free of viruses and won’t likely be part of such a botnet. Less savvy users, however, abound. In fact, many of your customers – unless you’re in a high-tech, complicated and expert-level niche – will be less savvy users. The stereotype of your grandma’s computer needing disinfecting every family holiday is a real problem. If your grandma wants to make a purchase, she might find her computer blocked because a virus on it has been part of a DDoS botnet in the past.
It doesn’t even need to be a personal computer anymore, these days. Many botnets today are actually made up of various Internet of Things devices. Anything from a thermostat to a light bulb to a coffee maker can be infected and slaved to a botnet. After all, who thinks about updating the firmware on their light bulbs, or even considers the possibility of a smart bulb visiting a website?
Cloudflare might block the light bulb – which is one hell of a cyberpunk sentence to write, let me tell you – but they might also just block the IP the infected device is using. That will have the side effect of blocking your computer, even if your computer did nothing wrong. Grandma might not be using smart light bulbs, but a sufficiently smart thermostat or a coffee maker or a smart TV might be more plausible.
Again, this is something Cloudflare is working on preventing. This is where user agent blocking comes in handy. They can block devices that aren’t using typical browser user agents, but then, a corrupted device can spoof a user agent too. It’s all a complex and frankly obnoxious arms race. Plus, you don’t want to block those devices if, say, you’re the one providing the firmware updates.
Benefits Worth the Risk
Given the chances of blocking legitimate traffic – and the suspicion that Cloudflare can hurt SEO in a few configurations – is it worthwhile to keep using the service? I’m of mixed opinion.
On the one hand, a DDoS is obviously a bad thing. If you’re under attack, using something to filter traffic and keep your server alive is a good idea. Site downtime is infinitely worse than a few blocked users, especially when you have no way of knowing whether or not those users are actually interested in becoming customers. Cloudflare isn’t the only game in town, but since they’re one of the most visible brands, they take a lot of flak for their competitors.
On the other hand, Cloudflare’s services can be beneficial in other cases. One of their primary uses is as a content delivery network, and using a CDN can be a benefit to SEO. Faster load times are beneficial to search ranking, and a CDN can speed up a slow loading site if your server can’t handle media as quickly as the CDN can.
At the same time, there are other CDNs. If you want to benefit from a CDN but don’t feel the need for DDoS protection, you can invest in something like Amazon’s CDN or Akamai. Of course, that can leave you high and dry if a DDoS does come knocking, so you need some kind of disaster management and recovery plan.
In general, I would say that using a CDN is always going to be a benefit, though you should focus on using one that serves your local region. Cloudflare users have had issues where the IP of the CDN-served content came from outside the usual region and thus took an SEO hit.
Using Cloudflare specifically can be a good idea if you believe your site is at risk of a DDoS attack, though there are other DDoS protection apps available if you don’t trust a tainted name.
If nothing else, if your business is not in a precarious position, you can test Cloudflare with little repercussion. Make an announcement on your site and on your social media that you will be testing Cloudflare, and ask that any users who experience issues come forward and help you troubleshoot. If you do find some issues, you can disable Cloudflare and cancel the experiment. If, on the other hand, you have no issues, Cloudflare can work just fine for your business.
The post Can Cloudflare Hurt Your Sales or Turn Away Customers? appeared first on Growtraffic Blog.
Welcome to this week’s edition of the Social Media Marketing Talk Show, a news show for marketers who want to stay on the leading edge of social media. On this week’s Social Media Marketing Talk Show, we explore Facebook rolling out a business page redesign on mobile with Andrea Vahl, Facebook bringing mentorship to Facebook [...]
The post New Facebook Business Page Layout for Mobile appeared first on Social Media Examiner.
We’re used to seeing the word ‘cloud’ stuck in front of basically every technological term out there. And while tech-savvy individuals have a decent grasp of what ‘the cloud’ is, the same can’t be said for all its potential applications. Take cloud hosting, for example, which is the alternative to running websites on shared hosting or dedicated servers.
Websites are of course hosted, or stored, on servers. But organisations and individuals are often faced with the question: what type of server is the best fit? Let’s look briefly at the three main options:
• Shared hosting
• Dedicated server hosting
• Cloud or virtual private server (VPS) hosting
– You Pays Your Money…
Shared hosting is by far the most common option for small businesses and individuals, consisting as it does of many websites hosted on a single server, and offering extremely good value for money as a result. A website on shared hosting can handle up to 30,000 visitors per month, which is all that most sites need. Shared hosting is also very simple to set up, making it ideal for the beginner or non-technical user, and packages usually come with unlimited bandwidth.
Dedicated server hosting on the other hand, is a single server hosting the website(s) or application(s) of a single user. The advantage of having a dedicated server is that the entire server is focused on optimum performance, all the time. While dedicated hosting can be expensive, its huge amount of processing power means that it’s worth the cost if your website requires very fast page-load times, a dedicated IP, and the ability to handle a lot of traffic – as many as 100,000 visitors per month, for example. Dedicated servers are very secure and often offer several IPs for multiple services that need to be kept separate.
Cloud hosting, also known as virtual private server (VPS) hosting, is probably the most difficult to describe out of the three types.
Imagine a computer with thousands of processors, terabytes of RAM, and unlimited hard drive space. Then imagine that, for an hourly fee, you could access as much or as little of those resources as you needed at short notice. In a sense, it’s the best of both worlds: a huge amount of computing resources, similar to that of a single dedicated server, but for an affordable price comparable to that of shared hosting – and with the benefit of scalability and flexibility thrown in.
With more customisation options than shared hosting, VPS hosting is also good for the more technically inclined, and usually caters to programmers and web designers.
– Bare Metal Servers
But there’s one other option that we haven’t mentioned so far: bare metal server hosting. This is a relatively recent development that offers a hybrid solution, providing performance and cost-effectiveness by combining the best bits of both dedicated hardware and cloud technology. Bare metal servers are not completely new – they’re more like a reinvention of dedicated servers – but they differ from dedicated servers in how they integrate with cloud-based technologies to offer increased flexibility and cost control.
Bare metal servers are ‘physical’ servers, not virtual, and they’re also ‘single tenant’, meaning that each one belongs to a single customer. While each server may run any amount of work for the customer, or may have multiple simultaneous users, a bare metal server is nevertheless dedicated entirely to the one customer – either an organisation or individual – who rents it. Unlike many servers in a data centre, these machines are not shared between multiple customers.
Bare metal servers are designed to deal with significant, but short-term, processing needs. Data can be stored, processed or analysed on a server for as long as is necessary, and then the server can be wound back down when it’s no longer needed. This way, resources aren’t wasted, and there’s no need to continue running the server for longer than necessary.
The contrast with VPS hosting or cloud servers is that in a typical cloud server infrastructure, there could be dozens of virtual machines running on the same physical server, each with its own processing requirements. Bare metal servers are single-tenant, so resources are dedicated to only one user who can count on guaranteed performance.
– Bare Metal = No Hypervisor = Best Performance
Bare metal servers offer higher performance by eliminating the hypervisor layer (the virtual machine monitor which creates and runs VMs and manages the execution of the guest operating systems). Running the hypervisor is a drain on resources which inevitably leads to a degradation in performance on cloud servers. However, there is no hypervisor layer on bare metal servers (because they are dedicated, physical machines), so this overhead, and the resulting performance hit, is eliminated.
From a technical perspective, a bare metal server is basically the same as a dedicated server – one that offers high-performance resources that are dedicated to one user – but with the advantage of flexible, pay-as-you-use billing and no contracts.
– It’s All About the Hybrid
Bare metal servers really come into their own when they’re combined with a more traditional cloud infrastructure. If you already have a cluster of virtual machines hosting your website for example, you can link your bare metal server to your VMs and have them work together.
High-performance bare metal servers are ideal for situations where companies need to perform short-term, data-intensive functions without any kind of overhead performance penalties, such as big data processing. Previously, organisations couldn’t put these workloads into the cloud without accepting lower performance levels, but with bare metal servers that risk is eliminated.
To summarise its selling point, the bare metal/cloud hybrid solution provides a way to complement or substitute virtualised cloud services with a dedicated server environment that eliminates the hypervisor overhead, but without sacrificing flexibility, scalability and efficiency.
LANSING, Mich. – Liquid Web, LLC, the market leader in managed hosting and managed application services to SMBs and web professionals, has announced the launch of their Protection and Remediation Services, to further help safeguard customers from cyber-attacks and to address many standard compliance requirements.
“Security continues to be top-of-mind for hosting customers looking to protect their cloud footprint. Most organizations, especially in the SMB market, do not have the expertise or resources to implement and manage the technologies that can provide even a minimal level of security,” said Chief Technology Officer Joe Oesterling. “Our new Protection and Remediation packages are purpose-built to provide security and compliance measures for our customers’ website and application workloads as well as their server instances.”
Industry studies continue to show that it can take 200 days (or more) for a company to identify a security breach. Most organizations are compromised for months before a breach is noticed, meaning that malicious activity has already done significant damage by the time it’s detected. Liquid Web’s new Protection and Remediation services will serve a dual purpose by preventing attacks on the server and application level, and by addressing many standard compliance requirements, such as web application firewall, antivirus and vulnerability scanning.
“Our goal is to offer a set of preventative security tools that protect customers from a variety of cyber-attacks and help them clean up their systems when malware does find its way into their systems,” said Jason Wolford, Product Manager.
To learn more about Liquid Web’s Protection and Remediation services, visit https://www.liquidweb.com/products/add-ons/security/protection-remediation-services/.
About Liquid Web
Liquid Web powers content, commerce, and potential for SMB entrepreneurs and the designers, developers and digital agencies who create for them. An industry leader in managed hosting and cloud services, Liquid Web is known for its high-performance services and exceptional customer support. The company owns and manages its own core data centers, providing a diverse range of offerings spanning from bare metal servers and fully managed hosting to Managed WordPress and Managed WooCommerce Hosting. As an industry leader in customer service*, Liquid Web has been recognized among INC Magazine’s 5000 Fastest Growing Companies for ten years. Liquid Web is part of the Madison Dearborn Partners family of companies.
SOUTHFIELD, MICH. – Nexcess, a leading provider of performance-optimized WordPress and WooCommerce hosting, this month celebrates the one-year anniversary of the introduction of managed WooCommerce hosting to its global eCommerce platform.
Last year, WooCommerce joined Nexcess’ line-up of premium managed application hosting plans, which also includes Magento, WordPress, CraftCMS, and OroCRM. Site performance is critically important to eCommerce revenues, and WooCommerce retailers on the Nexcess platform benefit from the company’s commitment to low-latency hosting, which is supported by an environment specifically engineered for WooCommerce.
In the year since Nexcess introduced WooCommerce hosting, the company’s platform has evolved. The brand-new Nexcess Cloud provides on-demand WooCommerce instances with automatic scaling, comprehensive performance optimization, and a range of developer-friendly features.
“The reaction to our WooCommerce hosting platform has been amazing, and we’re proud of the growth we have experienced in just a year,” commented Chris Wells, President and CEO of Nexcess. “With the introduction of WooCommerce cloud hosting earlier this year, we have seen even more interest from retailers who plan to use our platform to increase the performance and reliability of their eCommerce store and to offer an enhanced retail experience to shoppers.”
Nexcess is among the most experienced eCommerce hosting providers in the world, with almost two decades as a trusted partner of retailers of all sizes. Nexcess’ shared, dedicated, and custom clustered hosting solutions are trusted by thousands of retailers, who rely on Nexcess’ stability and expert support.
Nexcess provides WooCommerce and Magento hosting for eCommerce retailers of all sizes, from solo retailers and small businesses to the largest enterprise stores. With data center locations in the USA, Australia, and Europe, WooCommerce stores hosted on Nexcess benefit from low-latency connectivity to the most important international eCommerce markets.
Nexcess is a Southfield, Michigan-based eCommerce hosting company founded in 2000, with data centers distributed throughout the United States, Europe, and Australia. Nexcess offers a variety of cloud, dedicated, and clustered eCommerce hosting services for Magento and WooCommerce, with an emphasis on achieving maximum performance for high-traffic sites.
SOUTHFIELD, MICH. – Future Hosting, a managed VPS and dedicated hosting provider, has warned server hosting clients of the dangers posed by insecure Memcached instances. When configured incorrectly, Memcached, a popular caching application, can be used by bad actors to launch massive Distributed Denial of Service (DDoS) attacks (as reported in CSO Online).
Memcached is used by millions of websites around the world. It is a key-value database that caches the results of database queries to accelerate the performance of web applications. Memcached can be configured to accept connections from arbitrary hosts on the open web. Bad actors can use insecure Memcached instances to launch amplified, reflected DDoS attacks against their victims, taking their websites and applications offline.
Memcached is one of many applications that can be used to amplify the bandwidth available to an attacker: open DNS servers and NTP servers are also common vectors. But Memcached is significantly more potent. It can be used to amplify the data in a DDoS attack by a factor of more than 50,000.
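To see where an amplification factor that large comes from: public write-ups of these attacks described a spoofed request of roughly 15 bytes triggering a UDP response of up to about 750 KB. Treating those reported figures as illustrative rather than exact, the arithmetic looks like this:

```python
request_bytes = 15             # tiny spoofed "stats" request sent over UDP
response_bytes = 750 * 1024    # worst-case reported response, roughly 750 KB
amplification = response_bytes / request_bytes

print(f"amplification factor: {amplification:,.0f}x")
# → amplification factor: 51,200x
```

In other words, every byte the attacker sends can return tens of kilobytes to the victim, which is why open DNS and NTP reflectors, with far smaller ratios, look tame by comparison.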
“Future Hosting provides server hosting for thousands of businesses, and we’re concerned that insecure Memcached instances pose a serious threat to our clients and other businesses on the web,” said Maulesh Patel, VP of Operations of Future Hosting. “Memcached is ubiquitous on the modern web because of its usefulness, but less experienced system administrators are not configuring it securely, providing bad actors with a DDoS vector that threatens even the largest online businesses.”
Earlier this year, a popular version control platform was targeted by a record-breaking DDoS attack that peaked at 1.35 terabits per second. Soon after, that record was broken by a DDoS attack that used insecure Memcached instances to send 1.7 terabits per second to its victim. Few businesses can mitigate attacks of this magnitude.
Future Hosting urges server administrators to ensure that Memcached instances hosted on their servers are configured securely. Memcached should never be reachable from the open internet or configured to respond to requests from arbitrary hosts.
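As a quick self-audit of the advice above, an administrator can test whether a Memcached instance answers from an arbitrary host. The sketch below is a minimal, assumption-laden example – the hostname is a placeholder, and you would need to run it from a machine outside your network for the result to be meaningful. A `True` result means the instance is reachable from the open internet; the fix is to bind Memcached to localhost and disable its UDP listener (the standard `-l 127.0.0.1` and `-U 0` flags) or firewall the port.

```python
import socket

def memcached_exposed(host: str, port: int = 11211, timeout: float = 3.0) -> bool:
    """Return True if `host` answers a Memcached 'stats' request over TCP.

    Run this from OUTSIDE your own network: a True result there means
    the instance is reachable from the open internet and should be
    bound to localhost (memcached -l 127.0.0.1 -U 0) or firewalled.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(b"stats\r\n")
            reply = sock.recv(1024)
            # A live Memcached instance replies with lines starting "STAT".
            return reply.startswith(b"STAT")
    except OSError:
        # Connection refused or timed out: not reachable from here.
        return False

# Example (placeholder hostname):
# print(memcached_exposed("cache.example.com"))
```

This only checks TCP; the record-setting attacks abused the UDP listener, so confirming UDP is disabled (`-U 0`) matters just as much.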
Developers and system administrators without the expertise to securely configure server software should consider hiring a professional system administrator or a managed server hosting provider that can configure a secure hosting environment.
About Future Hosting, LLC
Founded in 2001, Future Hosting is a privately held leading Internet solutions provider specializing in managed hosting, including Dedicated Servers, Virtual Private Servers, and Hybrid Virtual Private Servers. The company has built a strong reputation for its high-quality service, innovative pricing models, and 3-hour Service Level Agreement. Future Hosting is based in Southfield, Michigan.
IRVINE, CA – RapidScale, a leader in managed cloud services, has announced that Veeam® Software, the leader in Intelligent Data Management for the Hyper‑Available Enterprise™, will be RapidScale’s leading data protection and availability solution.
Together, RapidScale and Veeam empower customers with a reliable and robust cloud-based backup solution. The platform gives customers a single pane of glass to their entire environment which helps organizations mitigate their cloud risk and data mobility cost challenges while maintaining data control across on-premises and cloud resources.
RapidScale has integrated this offering into its easy-to-use interface with powerful functionality, making a data backup and storage plan simple, reliable, and affordable. This gives customers the ability to have a complete view of their backup from not only a virtual setting but a physical environment as well. With Veeam’s leading set of capabilities for backup and availability in virtualized environments, customers will now be able to automatically anticipate needs, meet demand, and move securely across multi-cloud infrastructures.
“We are pleased that RapidScale has joined our VCSP (Veeam Cloud & Service Provider) program,” said Matt Kalmenson, vice president of service and cloud provider sales at Veeam. “Our Availability solutions, combined with their service delivery and knowledge of Veeam products, should enable them to acquire even more clients, resulting in additional growth of their overall profitability and value. We are looking forward to working together in an effort to solve the key challenges of our joint customers.”
RapidScale SVP of Technology Duane Barnes said, “As RapidScale continues to scale and move up the market, an advanced software-defined backup solution was critical to our success. After reviewing our requirements and customer demand, Veeam became the obvious choice because of its simplicity to deploy and protect mission-critical data.”
RapidScale, a managed cloud services provider, delivers world-class, secure, and reliable cloud computing solutions to companies of all sizes across the globe. Its state-of-the-art managed CloudDesktop platform and market-leading cloud solutions are the reasons why RapidScale is the provider of choice for leading telecommunications providers, VARs, MSPs, and agents throughout the United States. RapidScale is not only delivering a service but also innovating advanced solutions and applications for cloud computing. RapidScale’s innovative solutions include CloudServer, CloudDesktop, CloudOffice, CloudMail, CloudRecovery, CloudApps, and more. For more information on RapidScale, visit www.rapidscale.net.
In this special edition of the Social Media Marketing podcast, I reveal four lessons I have picked up from 6 years of podcasting (and growing Social Media Examiner). The topics I’ll cover include how to grow anything, how to succeed via omission, how to achieve thought leadership, and my view on competition. I’ll also share [...]
The post How to Grow: Wisdom From 6 Years of Podcasting appeared first on Social Media Examiner.
Do others manage your Facebook ads? Wondering how to get others to securely share account access to advertising assets? In this article, you’ll discover how to provide sharing access to Facebook ads, Google Analytics, and lead page assets. How Client Control Protects All Parties When creating digital marketing funnels, you deal with an array of [...]
The post How to Safely Share Access to Your Facebook Ads and Google Analytics Data appeared first on Social Media Examiner.
The voice revolution is coming. But how soon, and how much is happening now? What are the opportunities for businesses? This guide to voice answers those questions and more.
Is Voice Usage Growing?
The answer is a resounding “yes,” as shown by this chart shared by Bing’s Purna Virji at Pubcon Fort Lauderdale in April 2018:
Clearly, the growth in usage is rapid, but what’s missing from the data is anything that shows the absolute value of that growth. We also have the prediction from Andrew Ng, then Chief Scientist of Baidu, who suggests that 50% of all searches will be by either voice or images by 2020 (just two years away!).
And there’s this data from comScore:
Note that none of this data shows the actual percentage of total searches performed by voice (Andrew Ng’s forecast was just a prediction). So where are we then? These three points may temper our excitement:
I use both Siri and Google Assistant on my iPhone regularly. I have multiple Amazon Echo and Google Home devices at home. I often need to repeat commands to get them recognized. For example, I might come into my kitchen and say, “Hey Google, turn on the kitchen lights,” and need to repeat it three times to get it to execute that command. The same type of thing happens with Alexa. The point is that there are still real issues with the speech recognition side of things.
Many factors impact the quality of speech recognition, such as the wide range of types of voices (not to mention different languages, but let’s not add that complication for now) and the background noise in the environment where the device is trying to listen to a human speaker.
Humans solve these problems with relative ease. The machines are still developing a similar level of capability. They’re going to need to make a ton of progress on this front before voice use becomes ubiquitous, and this may take years to unfold.
In other words, I disagree with Andrew Ng’s projection. We won’t be at 50% in 2020. That said, this revolution IS coming, and you need to get ready for it. It may be many years before it becomes more popular than typing in search queries, but the time is coming. And remember, the majority of voice usage may not be in the form of voice search queries at all.
Voice Search. Or Is It?
As an industry, everyone seems to refer to what’s unfolding as “voice search.” That refers to replacing the traditional search box where we type in a query with spoken search commands. At this level of definition, this would not include commands like “Call Mom.”
I also wouldn’t consider using voice to text to message a friend as a voice search. Or setting a timer or alarm on your phone. Yet, personal assistants like Google Assistant, Alexa and Siri do consider these examples of voice usage.
I think we should be tracking the total landscape of voice interactions between humans and devices. Searching for answers to questions or navigating to websites remains a big part of what personal assistants will do for us, but it’s much bigger than that. So, voice search is a poor label for that. What should we use instead? How about “voice interactions?”
What’s Driving the Growth of Voice Interactions?
One of the major drivers is the growth of installed Internet of Things (IoT) devices. IoT refers to the different types of devices that are connected to the internet, including watches, cars, smart TVs, smart refrigerators, and more. Just how far has this come? According to IHS Markit, there were more than 27 billion IoT devices by the end of 2017:
Notice the base estimate of 1.7 billion PCs. While IHS Markit does not separate out the number of smartphones and tablets, Seeking Alpha estimates smartphones at 5 billion, and Statista estimates tablets at 1 billion. These three categories add up to 7.7 billion. That tells us that as of 2017, 72% of all the internet-connected devices are something other than a PC, smartphone or a tablet.
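The arithmetic behind that 72% figure, using the estimates cited above, works out as follows (the half-point difference from the exact result is just rounding):

```python
total_devices_bn = 27.0   # IHS Markit: internet-connected devices, end of 2017
pcs_bn = 1.7              # IHS Markit base estimate
smartphones_bn = 5.0      # Seeking Alpha estimate
tablets_bn = 1.0          # Statista estimate

browser_class_bn = pcs_bn + smartphones_bn + tablets_bn       # 7.7 billion
other_share = (total_devices_bn - browser_class_bn) / total_devices_bn

print(f"{other_share:.1%} of connected devices are something else")
# → 71.5% of connected devices are something else
```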
That means that 72% of the connected devices out there likely don’t have a search box or a traditional browser to work with. So how will users communicate with those devices? Ah yes—voice interaction seems like a great answer to that question! Important note: Of course, many of the IoT devices may not have any human interaction at all, but nonetheless, that category is seeing amazing growth as well.
We have data from Apple indicating that Siri is actively used on 500 million devices and data from Google that Google Assistant is available on more than 400 million devices.
We also see the rise of usage of smart speakers. According to Juniper Research, these will be in 55% of U.S. households by 2022.
Personal assistants tend to have a very high level of voice interactions, and smart speakers (which are a subclass of personal assistants) are designed to be voice-centric. These devices will help drive voice adoption rapidly higher.
Other types of IoT devices that users will interact with are seeing very high growth as well. For example, smart thermostat sales are predicted to reach 14 million in the U.S. and Canada in 2021 and reach around 12 million in Europe in the mid-2020s. And, smart appliances, such as refrigerators, are seeing explosive growth as well:
Source: Market Research Future
We are seeing a major trend of users growing more comfortable using voice with their devices in public. At Stone Temple (now Perficient Digital!), we did a survey of voice usage by more than 1,000 users in both 2017 and 2018. Here is what a year-over-year comparison showed for voice usage in different environments:
What Makes Voice Search Different?
I am going back to the term “voice search” on purpose for this section. There is a class of voice interactions that is clearly voice search. For example, a user might ask their Alexa device, “How old is Barack Obama?” and they’ll get an answer. Where do these responses come from? There are three sources:
Public domain information
Licensed databases of information
Featured snippets
What is a Featured Snippet?
A featured snippet is information extracted from a third-party website and presented in the search results above the rest of the organic search results. Featured snippets are identifiable because they include a link to the source page from which the information was drawn. Featured snippets are available as a data source for search engines like Google and Bing, but not to Amazon or Apple for their personal assistants. Here is an example of a featured snippet:
What is Public Domain Information?
This is information that is generally known to the public and is not possible to copyright. An example of public domain information is the fact that the capital of Texas is Austin. One large source of public domain information is the U.S. government, which pushes a lot of census data into the public domain, as well as data from other sources. All the players in the voice interactions arena have built their own databases of public domain information and continue to enhance those on a regular basis.
What are Licensed Databases of Information?
These are databases of information from businesses that collect such information and license that data to third parties. An example of this is song lyrics. Lyrics are subject to copyright law, and even if you know what they are, you can’t republish them without permission. In Google, if you search for “walk this way lyrics,” you will get the complete lyrics presented above the regular search results because Google has licensing deals in place for the lyrics to most popular songs (including “Walk This Way”). Players like Amazon and Apple need to get all their non-public domain answers via some form of licensing.
The Opportunity with Featured Snippets
Featured snippets are used by Google and Bing in response to regular (i.e., typed) search queries and get top-tier positioning in the search results, so it’s essential to learn how to get your site selected as a source for featured snippets. They also play a large role in the world of voice search.
In a voice search world, you get one answer in response to your query. Research has shown that a large percentage of responses to voice search queries come in the form of a featured snippet. If you’re not the source of that one answer, you get no visibility at all. Literally none. As a result, participation in the voice search ecosystem demands that you develop skills for obtaining featured snippets. That will help drive growth for your traditional organic search traffic as well.
You can learn more about how to get featured snippets in this eight-step guide.
The Role of Personal Assistants
It’s natural to associate your personal assistant with the device it’s on—Alexa with your Amazon Echo, Google Assistant with your smartphone or Google Home device and so forth. However, these personal assistants reside in the cloud, and there is no need for them to have any tie to a specific device at all. You can view the relationship between you, the devices, and your personal assistant like this:
An example of how this might work: you start a conversation with your cloud-based personal assistant through your TV, go to the kitchen and continue the dialogue through your refrigerator, and then, as you get into your car to go to the store, finish the conversation there. Reinforcing this flexibility, the personal assistant will recognize your voice, so there will be no need to “log in” to start or continue the dialogue.
As we saw in the data from Bing, one of the most popular uses of personal assistants today is asking informational questions. This raises the question of which assistant is smartest. In a study conducted here at Stone Temple (now Perficient Digital), we found that Google Assistant is the smartest, but Amazon’s Alexa and Microsoft’s Cortana are not far behind.
Another major aspect about personal assistants is that they will be developed over time to address nearly all your online needs. Want to book a trip to the Caribbean for a winter vacation? You’ll be able to use your personal assistant. Want to program a schedule for all your lights, heating and cooling systems, and other aspects of your house? You’ll be able to do that in your personal assistant too. And, of course, buying stuff online, getting answers to questions and doing research will all be part of the package.
The good news for businesses is that companies like Google, Amazon, Apple and Microsoft will rely on third parties to develop apps that run within their platforms. This is where opportunities lie for the rest of us to benefit from the rise of these platforms. We’ll examine that next.
Building a Personal Assistant App
Many companies are already building Alexa Skills and Actions on Google apps. These apps exist in the Alexa and Google Assistant ecosystems and can be provided to users who know about them. In the Alexa environment, the user must explicitly enable a Skill before it can be used. In the Google Assistant environment, an app can be invoked in one of two ways:
The user invokes the app by name, in a query like: “Ask Stone Temple: what is a NoIndex tag.” Note that no prior step of installing the Stone Temple Action is required.
The user asks a query that the Action has an answer for but does not explicitly invoke the Action, and Google decides to suggest your Action to the user. These are referred to as implicit queries, and they represent one of the exciting opportunities with Actions on Google apps: Google is essentially giving your brand exposure to users who did not know about your Action.
The basic concept of building a Skill or an Action can be quite simple. For example, you can develop a Skill/Action that answers simple questions for users. The Stone Temple Skill/Action answers questions related to SEO and digital marketing: you can ask it, “What is a noindex tag?” or “What does a nofollow tag do?” and it will respond with the answer.
Try out the Stone Temple Skill/Action apps!
The place to start is with Dialogflow.com. This website is from Google and will help you build an Action app. However, this very same site can also be used to output apps for many other platforms such as Facebook Messenger, Slack, Viber and Twitter. It also outputs JSON code that can easily be modified to create an Alexa Skill.
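To give a feel for what the fulfillment side of such an app looks like, here is a minimal sketch of a Dialogflow (V2) webhook handler in Python. The intent name and answer text are hypothetical; only the `queryResult` and `fulfillmentText` JSON fields come from Dialogflow’s webhook format:

```python
# Minimal sketch of a Dialogflow (V2) fulfillment handler.
# The intent name "WhatIsNoIndex" and the answer text are hypothetical;
# real requests arrive as JSON POSTs from Dialogflow to your webhook URL.

ANSWERS = {
    "WhatIsNoIndex": (
        "A noindex tag tells search engines not to include "
        "the page in their index."
    ),
}

def handle_webhook(request_json):
    """Map the matched intent to a spoken/written answer."""
    intent = request_json["queryResult"]["intent"]["displayName"]
    answer = ANSWERS.get(intent, "Sorry, I don't know that one yet.")
    # Dialogflow V2 reads the reply from the fulfillmentText field.
    return {"fulfillmentText": answer}
```

In production, a small web framework (Flask, Cloud Functions, and so on) would wrap this function in the HTTPS endpoint that Dialogflow calls.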
The key thing to understand is that you will need to build a very detailed language model. That means for a question like, “What is a NoIndex tag?” you need to specify the phrase variations you will support. A few examples of these are:
What does a NoIndex tag do?
Tell me what a NoIndex tag does.
How does the NoIndex tag work?
What is it that a NoIndex tag accomplishes?
Unless you specify the major phrase variants, neither Alexa nor the Google Assistant can process the phrases mentioned above. This can be a lot of work!
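One way to tame that work is to generate the variants from templates instead of typing each one by hand. A hypothetical sketch (the templates and terms are illustrative, not part of Dialogflow’s actual API):

```python
# Hypothetical sketch: generating training-phrase variants for an intent
# from a small set of templates, rather than hand-typing each phrase.

TEMPLATES = [
    "What is a {term}?",
    "What does a {term} do?",
    "Tell me what a {term} does.",
    "How does the {term} work?",
]

TERMS = ["noindex tag", "nofollow tag", "canonical tag"]

def training_phrases(templates, terms):
    """Expand every template against every term."""
    return [t.format(term=term) for term in terms for t in templates]

phrases = training_phrases(TEMPLATES, TERMS)
print(len(phrases))  # 3 terms x 4 templates = 12 phrases
```

The resulting list can then be pasted (or uploaded) into the training-phrases section of each intent.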
Once you’ve developed the App, you can run it in test mode before submitting it for certification. This is true for both Alexa and Google Assistant.
Personas in Voice Apps
One last important point, and one that many brands overlook, is the role of personas. You can, and should, project your brand persona via your voice app.
The reason is that humans instinctively attach a personality to a voice, whether it comes from a human or a machine. In fact, a Stanford University study by Eun-Ju Lee, Clifford Nass, and Scott Brave showed that computer-generated speech can convey gender. To test this, the study posed a basic dilemma:
Amy and John are roommates in an apartment and are not romantically involved. However, Amy’s parents think that she’s rooming with another woman, and now they’re coming out for a visit. So, the question posed in the study was, “Should Amy ask John to move out during the visit?” In the study, an equal number of men and women were split into four groups, and each group had a computer voice make an argument for how to resolve this, either by arguing that Amy should ask John to move out, or that John should stay. Further, two different voices were used, one female voice and one male voice.
The net result is that eight tests were conducted as follows:
One group of women heard an argument by a male voice that John should move out.
One group of women heard an argument by a male voice that John should stay.
One group of women heard an argument by a female voice that John should move out.
One group of women heard an argument by a female voice that John should stay.
One group of men heard an argument by a male voice that John should move out.
One group of men heard an argument by a male voice that John should stay.
One group of men heard an argument by a female voice that John should move out.
One group of men heard an argument by a female voice that John should stay.
All the groups were of equal size. The study showed that both men and women were more likely to accept the argument made by a voice of their own gender.
Another study by Clifford Nass and Kwan Min Lee showed that computers can have personality as well. In fact, people will recognize a computer’s voice as introverted or extroverted, and they’re more likely to like a voice similar to their own personality type.
From a voice app perspective, understanding these dynamics is incredibly important. Your brand should have its own persona, and it should match up with the preferences of the target audience for your business. Using the default voices provided by Alexa or the Google Assistant is just not going to cut it.
You’ll need to develop a persona strategy, find the right voice talent and invest in developing a persona that will best serve your brand’s interests.
For more information on voice personas and persona design, check out the presentation that Duane Forrester and I jointly did at SMX Advanced.
Are you gambling all your ad spend on the big players in online advertising? Maybe it’s time to hedge your bets and diversify your spending.
In this episode of our popular Here’s Why digital marketing video series, Stone Temple’s Mark Traphagen reveals that the long-standing hegemony of Google and Facebook for online advertising may be coming to an end, and what smart marketers will do about it.
Don’t miss a single episode of Here’s Why with Mark & Eric. Click the subscribe button below to be notified via email each time a new video is published.
Subscribe to Here’s Why
Data Suggests Surprising Shift: Duopoly Not All Powerful by eMarketer
See all of our Here’s Why Videos | Subscribe to our YouTube Channel
Eric: So Mark, you were telling me the other day that there are signs of disruption in the online advertising landscape. Share with our viewers what’s up.
Mark: I’d be happy to, Eric. For a long time now, the online advertising world has been dominated by just two players, Facebook and Google, but we’re seeing the first signs that their hegemony may have a termination date.
Eric: And what are those signs?
Mark: According to data from eMarketer, the combined advertising market share of Google and Facebook has been slowly declining since 2016.
Facebook’s share showed some growth since then but appears to have leveled off, while Google’s share has declined slightly in each of the past two years. eMarketer now projects that Facebook has peaked and that Google will continue a gradual decline over the next few years.
Eric: But they still represent the lion’s share of the market, right?
Mark: Oh, for sure. Together they’re projected to hold about 57% of the market for 2018, but that’s down from almost 59% last year. And digital advertising spend is up, so they’re still quite profitable.
But, the bigger shocker is the even steeper decline in new ad spend for those two giants. According to eMarketer, the two networks will get just 48% of the new ad spend this year, and that’s down from nearly 73% last year.
The other big story here is who is making a dent in that spend. The news isn’t all bad for Facebook, as their Instagram subsidiary is projected to pull in about $5.48 billion this year. That’s just 5% of the U.S. digital ad market, but it’s growing while Facebook’s share is stagnating.
Eric: What do you think lies behind all this?
Mark: There are probably a number of reasons, but one of the most likely is ad inventory. Demand for Facebook’s advertising was so high that by last year, Facebook admitted they had started running out of available inventory.
Eric: Wait, how can a digital platform run out of ad inventory?
Mark: They know they can only put so many ads into users’ feeds before those users become unhappy and stop using Facebook, so they have to limit the number of available ad slots, and that means the cost per 1000 impressions, the cost per result, all those things have been going up for advertisers.
Now, at the same time, Instagram’s user base is growing by leaps and bounds, so there is still plenty of inventory over there, plus some advertisers report better results from their paid Instagram efforts.
Eric: So who else is taking a bite out of Facebook’s and Google’s market share?
Mark: Well, one player few suspected is Amazon. We usually think of Amazon as an online retailer, which of course is primarily what they are, but they are also showing interesting gains in advertising revenue as independent sellers clamor for access to Amazon’s huge ready-to-purchase audience.
While Amazon is currently number five in online ad revenue, Monica Peart of eMarketer believes they’re poised to be number three by 2020, surpassing Microsoft at that point. And despite Snapchat’s growth struggles and stock price woes, they’re projected to break a billion dollars in ad sales this year.
What Does Disruption in Online Advertising Mean for Marketers?
Eric: So what does all that shifting around mean for marketers?
Mark: First off, I don’t think it’s even close to being time to dump whatever Facebook or Google advertising you’re doing. Though they may have peaked, both of them will continue to be the market leaders with the biggest potential audiences for many years to come.
But I also think it’s an excellent time to explore other options where costs are still low but growth is expected. It could pay big dividends in increased ROI to be more involved on those networks.
Eric: So are you saying you should spread out your advertising budget across all the social networks?
Mark: Well, not necessarily. I mean, you do have to use some discernment about which networks make the most sense for your brand.
For example, while Snapchat is continuing to grow its audience, it’s heavily weighted toward certain demographics, so it may not be the best fit for reaching your company’s audience. But if you have the budget, it’s worth at least testing almost any network to see if your assumptions are right or wrong.