Industry Buzz

How Does SSH Access Work on My Shared Hosting Plan?

InMotion Hosting Blog -

If you have a website on a shared hosting platform, you probably want to make sure that your site stays safe and secure. These days, there’s an unfortunate likelihood that your site will come under attack at some point. One tool you can use to help encrypt your information is SSH access, short for Secure Shell. SSH is an encryption protocol that lets you create a secure connection across otherwise unsecured networks. Continue reading How Does SSH Access Work on My Shared Hosting Plan? at The Official InMotion Hosting Blog.

Just Write Code: Improving Developer Experience for Cloudflare Workers

CloudFlare Blog -

We’re excited to announce that starting today, Cloudflare Workers® gets a CLI, new and improved docs, multiple scripts for everyone, the ability to run applications on workers.dev without bringing your own domain, and a free tier to make experimentation easier than ever. We are building the serverless platform of the future, and want you to build your application on it, today. In this post, we’ll elaborate on what a serverless platform of the future looks like, how it changes today’s paradigms, and our commitment to making building on it a great experience.

Three years ago, I was interviewing with Cloudflare for a Solutions Engineering role. As part of an interview assignment, I had to set up an origin behind Cloudflare on my own domain. I spent my weekend, frustrated and lost in configurations, trying to figure out how to set up an EC2 instance, connect to it over IPv6, and install NGINX on Ubuntu 16.04 just so I could end up with a static site with a picture of my cat on it. I have a computer science degree, and had spent my career up until that point as a software engineer — building this simple app was a horrible experience. A weekend spent coding, without worrying about servers, would have yielded a much richer application. And this is just one rung in the ladder — the first one. While the primitives have moved up the stack, the fact is that developing an application, putting it on the Internet, and growing it from MVP to a scalable, performant product all still remain distinct steps in the development process.

This is what “serverless” has promised to solve. Abstract away the servers at all stages of the process, and allow developers to do what they do best: develop, without having to worry about infrastructure.

And yet, with many serverless offerings today, the first thing they do is the thing that they promised you they wouldn’t — they make you think about servers.
“What region would you like?” (The first question that comes to my mind: why are you forcing me to think about which customers I care more about, East Coast or West Coast? Why can’t you solve this for me?) Or: “How much memory do you think you’ll need?” (Again: why are you making this my problem? You figure it out!)

We don’t think it should work like this. I often think back to that problem I was facing myself three years ago, and that I know developers all around the world face every day. Developers should be able to just focus on the code. Someone else should deal with everything else, from setting up infrastructure to making that infrastructure fast and scalable. While we’ve made some architectural decisions in building Workers that enable us to do this better than anyone else, today isn’t about expounding on them (though if you’d like to read more, here’s a great blog post detailing some of them). What today is about is really honing Workers in on the needs of developers.

We want Workers to bring the dream of serverless to life — of letting developers worry only about bugs in their code. Today marks the start of a sustained push that Cloudflare is making towards building a great developer experience with Workers. We have some exciting things to announce today — but this is just the beginning.

Wrangler: the official Workers CLI

Wrangler, originally open sourced as the Rust CLI for Workers, has graduated into being the official Workers CLI, supporting all your Workers deployment needs.
Get started now by installing Wrangler:

    npm install -g @cloudflare/wrangler

Generate your first project from our template gallery:

    wrangler generate <name> <template> --type=["webpack", "javascript", "rust"]

Wrangler will take care of webpacking your project, compiling to WebAssembly, and uploading your project to Workers, all in one simple step:

    wrangler publish

A few of the other goodies we’re excited for you to use Wrangler for:

- Compile Rust, C, and C++ to WebAssembly
- Create single or multi-file JavaScript applications
- Install NPM dependencies (we take care of webpack for you)
- Add KV namespaces and bindings
- Get started with pre-made templates

New and Improved Docs

We’ve updated our docs (and used Wrangler to do so!) to make it easier than ever for you to get started and deploy your first application with Workers. Check out our new tutorials:

- Deploy a Slack bot with Workers
- Build a QR code generator
- Serve and cache files from cloud storage

Multiscript for All

You asked, we listened. When we introduced Workers, we wanted to keep things as simple as possible. As a developer, you want to break up your code into logical components. Rather than having a single monolithic script, we want to allow you to deploy your code in a way that makes sense to you.

no-domain-required.workers.dev

Writing software is a creative process: a new project means creating something out of nothing. You may not entirely know what exactly it’s going to be yet, let alone what to name it. We are changing the way you get started on Workers, by allowing you to deploy to a workers.dev subdomain. You may have heard about this announcement back in February, and we’re excited to deliver. For those of you who pre-registered, your subdomains will be waiting for you upon signing up and clicking into Workers.

A Free Tier to Experiment

Great products don’t always come from great ideas, they often come from freedom to tinker.
When tinkering comes at a price, even if it’s $5, we realized we were limiting people’s ability to experiment. Starting today, we are announcing a free tier for Workers. The free tier will allow you to use Workers at up to 100,000 requests per day, on your own domain or on your workers.dev subdomain. You can learn more about the limits here.

New and improved UI

We have packaged this up into a clean and easy experience that allows you to go from sign-up to a deployed Worker in less than 2 minutes.

Our commitment

We have a long way to go. This is not about crossing developer experience off our list; rather, it’s about emphasizing our commitment to it. As our co-founder Michelle likes to say, “we’re only getting started”. There’s a lot here, and there’s a lot more to come. Join us to find out more, and if you’re ready to give it a spin, you can sign up there. We’re excited to see what you build!
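To make the Workers model concrete, here is a minimal sketch of the kind of script you could deploy with `wrangler publish`, using the standard Workers fetch-handler pattern. The greeting text and the runtime guard are illustrative additions, not part of the announcement:

```javascript
// Minimal Cloudflare Worker sketch: answer every request with a
// plain-text greeting. The response body is an invented placeholder.
function handleRequest(request) {
  return new Response('Hello from Cloudflare Workers!', {
    status: 200,
    headers: { 'content-type': 'text/plain' },
  });
}

// In the Workers runtime, addEventListener('fetch', ...) registers the
// handler; the guard lets the same file load outside that runtime.
if (typeof addEventListener === 'function') {
  addEventListener('fetch', (event) => {
    event.respondWith(handleRequest(event.request));
  });
}
```

There are no servers, regions, or memory sizes to configure anywhere in that file — which is exactly the point being made above.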

Is Managed VPS Like Having a Server Administrator?

InMotion Hosting Blog -

One of the top choices for website hosting today is VPS hosting. And with that, many people are wondering whether they should also consider adding a managed option to their hosting plan. As with everything in life, this isn’t a strict “Yes” or “No” question; several factors need to be taken into account. For many people, the benefits of having a managed VPS will definitely outweigh the added cost. Continue reading Is Managed VPS Like Having a Server Administrator? at The Official InMotion Hosting Blog.

How to Make More Money With 5 WooCommerce SEO Tactics

Liquid Web Official Blog -

Search engine optimization (SEO) can be a powerful way to grow your WooCommerce site. But it can also feel overwhelming. We’ll cover five WooCommerce SEO tactics that can make it a little easier.

SEO Basics

SEO is the art and science of getting your website found in the free or organic search space of Google, Yahoo, and Bing. Why does that matter? Because that’s how people find stuff to buy:

- 93% of online experiences begin with a search engine.
- 43% of all eCommerce traffic comes from organic Google searches.

You should care about SEO because it’s a simple way to increase sales: rank higher in the search engines, and more people will find your site and buy your stuff.

Watch the Webinar

This valuable SEO knowledge comes from a webinar presentation about managing SEO for WooCommerce stores with Lindsay Halsey from Pathfinder SEO and Chris Lema, Liquid Web’s VP of Products and Innovation. You can watch the full webinar or read our summary below.

How to Approach SEO

Just because SEO sounds pretty simple doesn’t mean it’s easy to accomplish. A lot of people quickly feel overwhelmed by the amount of work and the changing technical landscape. You need to approach WooCommerce SEO like climbing a mountain: it can be an overwhelming goal, but you can break it down into incremental steps. Ask yourself what you can accomplish in the next hour to get closer to effective SEO. Improving SEO will take some work, but you can get there if you:

- Plan and process: You’ve got to have a plan and process to be effective. You can’t take a random or scattershot approach.
- Learn from others: You’re not in this alone. See what others are doing and make adjustments.
- Pick reasonable but lofty goals: Aim high. Not so high that you’ll never get there, but make some serious goals.
- Put one foot in front of the other: You have to put in the work, but take it one step at a time.

5 WooCommerce SEO Tactics

We’ve got five tactics to help improve any store’s search engine rank:

1. Start With Qualitative Keyword Research

Keywords are the words and phrases people enter into search engines. They’re also known as search queries. Keyword research is the process you’ll use to identify the phrases your customers actually use. This is important because it’s foundational to everything you do with SEO. Too often store owners dive into SEO without doing any qualitative keyword research. We think we know our audience, so we plow forward without doing the research. But it’s important to stay in touch with your audience and know the exact language they’re using. It may surprise you.

There is a six-step process that can help with keyword research:

- Understand your audience: Start by asking questions and listening.
- Brainstorm: Do some research to see what terms your audience is using.
- Quantify: How many people are actually searching for a given keyword?
- Organize & evaluate: Organize search terms into groups and clusters. Which ones are worth pursuing?
- Map: Align keywords to pages on your site.
- Repeat: Do it again and again, because the SEO space evolves, the tools get better, and your business changes.

Again, store owners are often quick to skip the first two steps. It’s tempting to get right into the data and see what keywords are popular. But before you get there, you need to do that research and make sure you’re not missing important search terms because you didn’t realize what terms people were searching for.

2. Share Your Expertise Via Content Marketing

Stores don’t create enough content. That’s a problem because search engines love content. Google cares about expertise, authority, and trust (EAT). Content builds expertise. So stores need to be sharing with the world why they’re an expert in their particular area. Give search engines a reason to link to your store. The average first-page result on Google is 1,890 words, so long-form content wins.
You need to write a lot of content to be able to share expertise and give search engines something to point to. Offer detailed, in-depth information and create ultimate guides.

Helpful tip: Many store owners balk at writing, so make it easy. Use an audio recording app to get them talking. Ask a bunch of questions and they’ll have no problem sharing helpful insights. Simply transcribe those answers, and you’ve got a good start on quality content. (One popular service offers transcription for $1 per minute.)

Another good content approach is to aim for the position zero result. This is a newer feature where Google directly highlights a result to answer a specific question. You can take your keywords and apply them to a grid framework to generate a number of questions and then answer them. When it seems hard to flesh out expertise, this can be a good way to showcase the breadth of your knowledge. Use this grid framework to quickly flesh out your expertise.

3. Automate Your On-Site Optimization

Store owners can often be overwhelmed by the sheer volume of their sites. Trying to improve SEO on such a massive scale can be a daunting task. Instead of being overwhelmed, create a strategy that balances automation with customization. There are a number of elements on a web page that matter to search engines, including page titles, meta descriptions, H1–H6 tags, alt text, internal links, and more. There are also rich snippets: enhanced features such as reviews, star ratings, best use cases, and more that will deliver higher rankings and better click-through rates. Some of these elements can easily be automated. Store owners can also look at their sites and decide to customize top-level pages like the home page, category pages, top products, and the blog landing page, while automating lesser pages like blog posts or product pages. The Yoast WooCommerce SEO plugin can be especially helpful for automating WooCommerce SEO.
You can create standard page titles and meta descriptions using variables, so they’re still useful and SEO-friendly, but you don’t have to rewrite every single one. Invest an hour into automation and see how much you can improve your on-site SEO.

4. Leverage Relationships to Build Backlinks

Good backlinks build authority. When different sites link to the same page, that page probably has some helpful info. So search engines pay attention to backlinks. But when we talk about link building, too often we think of spam, black hat SEO, and lame email requests. While stores do need to build links, they should never resort to these tactics. Think of link building like being a good neighbor: you want to be proud of your neighborhood and you want to build good relationships. So focus on those professional relationships. Think about all the different types of business relationships you have:

- Partners
- Certifications
- Directories
- Customers
- Sponsorships
- Media

Often it would be entirely appropriate for some of those sources to link to your site. All you have to do is ask. When you’re building on a pre-existing relationship, it’s not weird or awkward or spam. It can be mutually beneficial. But don’t outsource link building. Do it yourself.

5. Give the Search Engines Structure

Search engines pay attention to the structure of sites. Too often stores either over-complicate or under-do site structure. It’s especially important for WooCommerce SEO because eCommerce sites can be massive, and search engines need help understanding a site that large. Here are three quick structure best practices:

- Keep it simple: Don’t get overly complicated.
- Keep it scalable: Make sure you come up with a structure that can grow as the store grows.
- Keep it limited: Every page should be no more than three clicks from the homepage.
It can often be helpful to map out what a site structure looks like.

5 WooCommerce SEO Tactics

So, remember these simple SEO tactics to help improve any WooCommerce store:

- Start with qualitative keyword research: Remember not to skip the important steps of audience research and brainstorming.
- Share expertise via content marketing: This one can be overwhelming for store owners, so make it easy for them.
- Automate your on-site optimization: Try investing an hour and see how much can be automated.
- Leverage relationships to build backlinks: So many stores are skipping this step, but it can be a powerful way to climb the search engine ranks.
- Give the search engines structure: This not only helps search engines, but it makes a store easier for customers to navigate.

The post How to Make More Money With 5 WooCommerce SEO Tactics appeared first on Liquid Web.
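The rich snippets mentioned in tactic 3 are driven by schema.org structured data embedded in product pages. As a rough sketch of what that markup looks like (all product details below are invented placeholders, and plain JavaScript stands in for what a plugin like Yoast generates for you automatically):

```javascript
// Hedged sketch: build schema.org Product JSON-LD of the kind that powers
// rich snippets (review stars, price) in search results. Every product
// value here is an invented placeholder.
function buildProductJsonLd(product) {
  return JSON.stringify({
    '@context': 'https://schema.org',
    '@type': 'Product',
    name: product.name,
    description: product.description,
    offers: {
      '@type': 'Offer',
      price: product.price.toFixed(2),
      priceCurrency: 'USD',
      availability: 'https://schema.org/InStock',
    },
    aggregateRating: {
      '@type': 'AggregateRating',
      ratingValue: product.rating,
      reviewCount: product.reviewCount,
    },
  });
}

// A product page would embed the output inside:
//   <script type="application/ld+json">...</script>
const jsonLd = buildProductJsonLd({
  name: 'Hand-Poured Soy Candle',
  description: 'A 12 oz soy candle with a cotton wick.',
  price: 18.5,
  rating: 4.8,
  reviewCount: 112,
});
```

The point of automating this, as the webinar suggests, is that the same template fills itself in for every product, so you only hand-tune the markup on your most important pages.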

Instagram Pushes IGTV Growth With Horizontal Video

Social Media Examiner -

Welcome to this week’s edition of the Social Media Marketing Talk Show, a news show for marketers who want to stay on the leading edge of social media. On this week’s Social Media Marketing Talk Show, we explore horizontal video support for IGTV and other video broadcasting updates with special guests, Luria Petrucci and David […] The post Instagram Pushes IGTV Growth With Horizontal Video appeared first on Social Media Marketing | Social Media Examiner.

FindMyHost Releases June 2019 Editors’ Choice Awards

My Host News -

OKLAHOMA CITY, OK – Web hosting directory and review site FindMyHost released its June 2019 Editors’ Choice Awards today. Web hosting companies strive to provide their customers with the very best service and support, and FindMyHost takes this opportunity to acknowledge the hosts in each category who have excelled in their field. The FindMyHost Editors’ Choice Awards are chosen based on editor and consumer reviews. Customers who wish to submit positive reviews for their current or past web host are free to do so by visiting the customer review section of the site; by doing so, you nominate your web host for next month’s Editors’ Choice Awards. We would like to congratulate all the web hosts who participated, and in particular the following, who received top honors in their field:

- Dedicated Servers
- Business Hosting: ServerWala
- European Hosting
- VPS: HomepageUniverse
- Secure Hosting
- Cloud Hosting
- Hybrid Servers
- Budget Hosting: Innovative Hosting
- Enterprise Hosting
- Shared Hosting: QualityHostOnline
- Virtual Servers
- SSD Hosting
- Cloud Servers
- Managed Hosting
- cPanel Hosting
- Website Monitoring
- OpenVZ
- Blog Hosting: RivalHost

About FindMyHost

FindMyHost, Inc. is an online magazine that provides editor reviews, consumer hosting news, interviews, discussion forums, and more. The site was established in January 2001 to protect web host consumers and web developers from making the wrong choice when choosing a web host.
The site showcases a selection of web hosting companies that have undergone its approved host program testing and provides reviews from customers. FindMyHost’s extensive website can be found at FindMyHost.com.

The June 2019 promo code is near and dear to our hearts

Blog -

Summer is almost here and so is our June 2019 promo code, which can help you save on your .com and .net renewals all month long. Use the promo code MAYO June 1-30, 2019 to renew your .com domains for $10.99 and .net domains for $12.99. This promo cannot be used on new registrations or […] The post The June 2019 promo code is near and dear to our hearts appeared first on Blog.

What Can You Expect with an InMotion Hosting Plan?

InMotion Hosting Blog -

Web hosting is a must-have when it comes to your website. And, of course, you want the best hosting provider out there. But with so many options, how do you know which one is right for you? We can’t speak for the other guys, but we can tell you all about what to expect from InMotion Hosting. (And we’ll let you in on a little secret – we think we offer the best service out there!) In this guide, we’re going to go over what a web host is, what you should look for in a hosting service, and what to expect from InMotion. Continue reading What Can You Expect with an InMotion Hosting Plan? at The Official InMotion Hosting Blog.

How to Build Your WordPress Website [Step by Step Guide]

HostGator Blog -

The post How to Build Your WordPress Website [Step by Step Guide] appeared first on HostGator Blog.

You’ve decided to build a website. Whether you’re building a site for your physical business or a personal project, or you’re gearing up to launch your own online business, this post is for you. Luckily, you’re reading this today and not ten years ago. Technology and the internet move so fast that once impossible or laborious tasks can be done in a single afternoon. The same can be said for building your website.

Back in the old days of the web, before content management systems and beginner-friendly website builders existed, you’d have to code every aspect of your website yourself. If you didn’t possess the ability to code or didn’t want to learn, then you’d be out of luck, unless of course you had the budget to pay a professional. But today’s web is much different. Anyone with the desire to build a website can go from no website to a published website in a single day. Below you’ll learn everything you’ll need in order to build your first website, including a variety of website building tools you can consider using. The most difficult thing you’ll face isn’t actually building your site but choosing which approach to take.

1. Choose the Right Platform

As we just mentioned, you have a ton of different options for actually building your website. Choosing the right approach is important and might be one of the most time-intensive parts of the process. For example, you can build your site entirely from scratch, using languages like HTML, CSS, and more. You can use a website builder to drag and drop your way towards creating a basic website. Or you can rely upon a CMS to provide you with a customizable framework to build your site from. The best options for beginners are using a website builder or an intuitive CMS like WordPress. If your plan is to build out a fairly basic website, then using a website builder will be the easiest course of action.
Website builders are designed with complete beginners in mind. With these, the process is simple:

- Pick a website builder
- Choose a payment plan for your site and traffic needs
- Add your domain
- Select a theme or template
- Customize your site
- Publish it live on the internet!

If you’re going to take this approach, then consider using the HostGator website builder. If you’re already a HostGator customer and want to create a simple site, then this is a no-brainer. For those with more complex site requirements, WordPress is the way to go. With WordPress, your customization options are nearly unlimited. Plus, it’s built to help you create content and get the most out of your site. Nearly every single host will let you quickly install WordPress in a couple of clicks. And even though it’s a more complex platform, the learning curve is very small, making WordPress an ideal choice for beginners. For the sake of this walkthrough, we’re going to assume you went with WordPress as the CMS you’re going to use to build your site.

2. Secure Your Domain Name and Host

In order to have any site live on the Internet, you’re going to need a domain name and host. Without a domain name there’s no way for people to access your site, and without a host, there’s no place to store the files that make up your website. There are ways to get a free domain name and hosting, like using a subdomain of a larger site. However, this doesn’t look very professional, and you’ll have a hard time building a following without a professional domain name. The same goes for purchasing your own hosting. There are a ton of different options to choose from, but for those building their first sites, a basic shared hosting plan will suit you just fine. In time you may want to upgrade your package, but there’s no need to complicate things from the start. With shared hosting, you’re sharing server resources with other sites using the same server.
This effectively splits costs between a ton of different users, so your monthly bill will be very low. Like anything online, there are multiple ways you can secure your domain name and hosting. You can spend time researching all your available options, or to keep things simple, just follow the process below. Here’s how you can purchase a shared hosting plan from HostGator: First, head over to this page and select your plan. On the next screen you’ll be asked to enter more information, and you’ll even have the option to add a domain name. Complete the steps, enter your payment information, and you’re all set. If you have an existing domain name, then you’ll have to point it towards your new host. You also have the option to purchase your domain name from another provider entirely. However, if you’re already purchasing hosting from HostGator, it’s much easier to add a domain to your order.

3. Install Your CMS on Your Host

With your hosting and domain name taken care of, it’s time to install WordPress. Most beginner hosts make this easy by including software that lets you install your CMS of choice with a few clicks. Instead of having to download your CMS and upload it to your server yourself, this software will automate all of those technical tasks. To do this we’re going to need to access your server. If you went with a host like HostGator, then we’ll be doing this with a tool called cPanel. This software is installed on your server and makes it incredibly easy to manage your server environment. Here’s how to do it: First, log in to your server via your control panel. You should have been provided your login link, along with your username and password, when you signed up. Next, locate a tool called ‘QuickInstall’. You’ll see that there are numerous other content management systems you can install, but for this tutorial, we’re going to be using WordPress. So, select ‘WordPress’, enter your relevant site details and click ‘Install’.
Once the installer is finished, WordPress will be installed on your site. Just a few more steps and your site will be ready for the world.

4. Choose Your Theme

WordPress will form the foundation of your site. But in order to make any customizations to how your site looks, you’re going to need to install a WordPress theme. Essentially, a theme is a collection of files that determine how your site looks and functions. You can think of WordPress as the foundation and scaffolding for your house: its basic structure. Your theme is the color of your home and your wood floors: what it looks like. Since the WordPress ecosystem is so large, you’ll have thousands of different themes to choose from (and that’s just counting the free themes). There are also premium themes that provide you with even greater functionality, features, support, and a lot more. Premium themes generally look and function better than free themes, have higher-quality code, and have dedicated support teams behind them. The cool thing about WordPress is that you can switch themes at any time. It won’t have any effect on your existing blog posts, pages, and other media. If you’ve done a lot of customizing, your site might not display properly, but your content isn’t going anywhere.

Here’s how you can select and install a theme on your site: From the backend of your WordPress dashboard navigate to Appearance > Themes. Then click the button that says ‘Add New’. Here you’ll be able to browse the massive selection of free WordPress themes or even search for a theme. Once you’ve found a theme that you like, hover over it and click ‘Install’, then ‘Activate’. Now you’ll be able to customize your theme to your liking. If you want to use a premium theme, then the installation process will be a little different. First, you’ll need to purchase and download a theme.
There are dozens of places you can purchase high-quality themes online, including:

- Elegant Themes
- StudioPress
- ThemeForest

Once you purchase and download your theme you’ll have a .zip file that contains all of the theme’s files. Now, you’re going to upload this file to WordPress. Navigate to Appearance > Themes, then select ‘Add New’. On the next screen select ‘Upload Theme’, then either drag and drop or locate the .zip file on your computer. WordPress will then upload and install your theme; just click ‘Activate’ and you’re all set.

If you want to add more features to your site, you’ll be relying on the nearly endless library of WordPress plugins. There’s a wide variety of both free and premium plugins that can help you add whatever features you desire to your site. To install any plugins navigate to Plugins > Add New. Here you can browse by popular plugins, recommended plugins, or search for a plugin you’ve found online.

5. Customize Your Site

By now you’ve gotten your domain name and hosting, installed WordPress, selected your theme, and maybe even installed a few plugins. Now is where the fun really begins: it’s time to customize your site. The first thing you’ll want to do is create a few necessary pages:

- Home page
- Contact page
- About page
- Blog page

Depending on your site there are probably additional pages you’ll want to create as well, like a services page, a resources page, or anything else really. Adding pages to your WordPress site is very easy. Once you’re in the backend of your site navigate to Pages > Add New. This will open up a screen where you can add a page to your site. Just add your title, text, any images or media, then click ‘Publish’. If you want to add any blog posts to your site, you’ll follow a similar process. Click on Posts > Add New, and you’ll be taken to a screen that looks nearly identical to the page editor. Write your blog post and click ‘Publish’. The approach you take to customize your theme depends upon what theme you’re using.
For example, your theme might have its own customization options which you’ll control from a different tab. If you’ve purchased a premium theme, check the documentation that came with your theme to see how to make customizations. However, you’ll also be able to make general customizations by navigating to Appearance > Customize. On this screen, you’ll be able to do things like change your site’s color scheme, header elements, menus, general theme settings, and a lot more.

6. Launch!

Once you’re satisfied with how your site looks and functions, it’s time to launch your site. If you’ve done everything above then your site should already be online! Just type your URL into the browser and you should see your site live. If you want to build out your site without people being able to see it in an unfinished state, you might want to use a coming soon or maintenance mode plugin. To do this you can install a plugin like ‘Coming Soon Page & Maintenance Mode by SeedProd’. Head over to Plugins > Add New, then search for ‘coming soon seedprod’. Then, install a plugin that looks like the one below. Once you’ve activated the plugin, head over to Settings > Coming Soon Page & Maintenance Mode, where you’ll be able to customize the appearance of your coming soon page. Then, when you’re ready to launch, just make sure to disable the plugin.

Closing Thoughts

Building your website isn’t as difficult as it once was. If you followed the steps above you’ll be on your way towards having a fully functional website live on the web. Note that there are multiple approaches you can take; some are easier than others. For example, if you’re already using HostGator hosting, then one of the simplest options will be using the bundled website builder. This tool is very straightforward and intuitive to use and is equipped with a variety of templates to suit your needs. For those looking to create a more robust website, consider using WordPress.
This widely used CMS is the foundation for many of the largest and most highly trafficked sites on the entire Internet. With its flexibility and ease of use, you can create whatever style of site you desire. The route you choose is up to you. Just make sure you take stock of your needs, current skills, and the overall goals of your site before you choose the best approach for building it out. Find the post on the HostGator Blog.

Women in Technology: Lindsey Miller

Liquid Web Official Blog -

Liquid Web’s Partner Marketing Manager on building community, nurturing relationships, and putting her time to good use. “I know that I make a difference in people’s businesses,” says Miller, “and that motivates me to come to work every day and do a great job.” Lindsey Miller is no stranger to enterprise. “I started my first business in kindergarten!” Miller made seasonal crafts— turkeys drawn from the outline of her hand, Christmas trees— and sold them to family members over holiday meals. “Yes,” she says, “I sold my grandparents my drawings instead of giving them away!” That Miller was so resourceful at such a young age is unsurprising, having grown up on 200 acres in Oologah, Oklahoma, the birthplace of Oklahoma’s Favorite Son, Will Rogers. “My conversations around cattle can surprise a lot of people,” she says. Her tech journey began almost ten years ago when she was working as a political fundraiser and met her now-husband, Cory. “He had a WordPress plugin company and he got me started by blogging about politics.” Then, in 2011, Miller started a non-profit called The Div, teaching kids to code. Her path in tech was solidified after that, working with WordPress and empowering businesses around the use of the platform. Though she now lives a few hours away from Oologah in Oklahoma City, Lindsey Miller puts her ingenuity to use as Liquid Web’s Partner Marketing Manager, investing in community and relationships. “I have been involved in the WordPress community for a long time,” says Miller. “I truly care about WordPress and those who build their businesses around it.” She takes pride that now, in her role at Liquid Web, she gets to help those who rely on WordPress to grow. Miller loves working in tech for the innovation that it entails. “I was a part of the team that brought the very first WooCommerce Hosting product to market. There was so much creativity! At Liquid Web, we’re encouraged to think outside of the box. That’s very exciting to me,” she says. 
But for Miller, success is about more than inventiveness. It’s about people. She loves exploring ways she can help those who turn to Liquid Web as they build their business. Miller is currently creating education opportunities like webinars, documents, and blueprints which businesses can use to reach their goals and increase revenue. Miller wants to build a community around people who create on the web and take Liquid Web beyond just being a hosting company. “If I can help our partners and their businesses,” she says, “then I feel that I will have accomplished something great. I have a strong perspective on how to build a relationship and create a community. It starts with caring about people over profit.” She recognizes the power of community and strong relationships in her own life, as well. “Much of my success, I attribute to people who believed in me.” She credits the many mentors, leaders, and colleagues who inspired and taught her along the way. “I am lucky to have had many wonderful advisors put me under their wings over the years and help me continue to grow and learn,” she says. Among those who have impacted her profoundly in both her personal and professional life are her husband, Cory— “He is the reason I learned as much as I have to get where I am in my career.” — and the leadership team at Liquid Web. “As my first real corporate job, I did not expect to get much attention from anyone other than my team and supervisors,” says Miller, “but I regularly connect with our leadership team including tremendous women like Terry Trout and Carrie Wheeler, among all of the talented colleagues that I learn from every day. I have grown tenfold since starting at Liquid Web two-and-a-half years ago.” Chris Lema, Liquid Web’s VP of Products and Innovation, has also been instrumental in Miller’s growth, challenging her to continue expanding her skill set. 
“If it wasn’t for Chris recognizing the skills learned in politics and developed during my time with Cory, I wouldn’t be here.” It’s been an important two-and-a-half years for Miller who takes great care about how she spends her time and who she spends it with. “Time is so precious,” she says. “I don’t want to waste it.” This outlook will come as no surprise to her colleagues. Spending her formative work years in politics, Miller learned to work quickly and diligently, always under a deadline. She jokes about the impact those experiences have had on her work style. “When asked when I need something, I tease my co-workers that my answer is always ‘as soon as possible’. Maybe my next career lesson will be learning how to wait.” Miller encourages young women considering a career in tech to focus on building relationships. “You will get further in life and work by growing with others instead of in spite of others or on the backs of others. Create relationships. Champion other people, as well as yourself. Working together makes everything better, personally and professionally.” A career in tech, she says, is also an exciting way to see the palpable outcome of hard work. “A great thing about working in tech is that there are not arbitrary results. What you do and your work product is there for everyone to see. For young women, it is a very tangible way to work towards something that takes intelligence and creativity.” And, Miller says, tech offers incredible space for growth. “It is a vast industry. The opportunities are endless.” The post Women in Technology: Lindsey Miller appeared first on Liquid Web.

How to Create Content That Attracts Customers

Social Media Examiner -

Do you need a better content marketing plan? Wondering how to improve your content strategy? To explore creative ways to regularly create content, I interview Melanie Deziel. Melanie is a former journalist, storytelling expert, and founder of StoryFuel, a company that helps marketers become better storytellers. Discover different ways to create content both on and […] The post How to Create Content That Attracts Customers appeared first on Social Media Marketing | Social Media Examiner.

Understanding the Architecture and Setup of VPS Hosting

Reseller Club Blog -

In our previous articles we’ve covered everything from what VPS Hosting is, to the types of VPS Hosting, to how to install or enable certain plugins. However, there are two things we haven’t covered: the architecture and the setup. For anything to be built or function properly, there needs to be a process or an architecture in place, and that is true even when it comes to your web hosting. The aim of this article is to help you, the reader, understand how VPS Hosting works and how to set up VPS Hosting on your hosting package.

What is VPS Hosting?

VPS (Virtual Private Server) Hosting is a type of hosting in which several websites are hosted on a single physical server, while giving each user the experience of an isolated server. Here, each individual server gets its own resources like CPU, RAM, and OS, with users having complete root access. Thus, VPS Hosting is said to be a combination of Shared and Dedicated Hosting.

How VPS Hosting Works

To segregate your physical server into multiple virtual servers, your hosting provider requires a virtualization software, known as a hypervisor. The hypervisor acts as a virtualization layer. It essentially abstracts the resources on the physical server and lets your customers have access to a virtual replica of the original server. This server is known as a Virtual Machine (VM). Each VM has its own dedicated resources like CPU, RAM, OS, and individual applications. As you can see from the above diagram, in the virtual architecture a single physical server is divided into three separate servers, and there is a layer of virtualization between the operating system and the physical server. All these servers are isolated from each other. The advantage of VPS Hosting is that each user has full root access due to the isolated nature of the servers, which ensures privacy and better security. Now that we’ve seen how VPS Hosting works, let us move on to understanding how to set up a Virtual Private Server. 
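Before moving on to the setup, the resource-partitioning idea behind the hypervisor can be illustrated with a toy sketch. This is a conceptual illustration only, not a real virtualization layer: each VM gets a dedicated slice of the physical server's CPU and RAM, and the hypervisor refuses to hand out more than the hardware actually has.

```python
# Toy model of a hypervisor partitioning one physical server into
# isolated VMs, each with dedicated CPU and RAM (illustration only).

class PhysicalServer:
    def __init__(self, cpu_cores, ram_gb):
        self.cpu_cores = cpu_cores
        self.ram_gb = ram_gb

class Hypervisor:
    def __init__(self, server):
        self.server = server
        self.vms = []

    def create_vm(self, name, cpu_cores, ram_gb):
        # Refuse to allocate beyond the physical server's capacity.
        used_cpu = sum(vm["cpu_cores"] for vm in self.vms)
        used_ram = sum(vm["ram_gb"] for vm in self.vms)
        if used_cpu + cpu_cores > self.server.cpu_cores:
            raise ValueError("not enough CPU cores available")
        if used_ram + ram_gb > self.server.ram_gb:
            raise ValueError("not enough RAM available")
        vm = {"name": name, "cpu_cores": cpu_cores, "ram_gb": ram_gb}
        self.vms.append(vm)
        return vm

hv = Hypervisor(PhysicalServer(cpu_cores=16, ram_gb=64))
vm1 = hv.create_vm("vps-1", cpu_cores=4, ram_gb=16)
vm2 = hv.create_vm("vps-2", cpu_cores=4, ram_gb=16)
```

A real hypervisor does far more (scheduling, device emulation, isolation), but the accounting above is the essence of why each VPS sees its own dedicated resources.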
For this walkthrough, we will be provisioning a VPS on a ResellerClub hosting package. Let’s begin!

Setting up VPS Hosting

1. Log in to your Reseller Account: Log in to the ResellerClub Control Panel using your Reseller ID and password. Go to the top right side of the dashboard and click on Buy to purchase orders.

2. Place an Order: To purchase VPS Hosting, you first need a domain name linked to it, so we will purchase both the domain and the VPS Hosting.

Purchasing a Domain Name: Go to ‘Select Product’ and select Domain Registration from the drop-down list. Enter the domain you want and check if it is available. Should you want Privacy Protection, you can add it at an additional cost.

Purchasing VPS Hosting: After you’ve purchased your domain name, it is time to link it to your choice of hosting. Refresh the page and, in the same ‘Select Product’ drop-down, select Linux KVM VPS. Type the domain name you want to link the hosting with, along with all the product specification details (we will link it with the domain we purchased). Next, choose any Add-ons you want: the control panels, viz. cPanel and Plesk, and the WHMCS (Billing) Add-on are available with VPS Hosting. We have selected cPanel and WHMCS. If you don’t want any Add-on, select None.

3. Accessing your VPS Hosting: After purchase, your domain name and VPS Hosting are automatically added to your control panel. To access the orders, go to the main dashboard and click on Products → List All Orders → click on the order you want to access. We will be choosing VPS Hosting.

4. Setting up your VPS Hosting: With ResellerClub, your VPS server is provisioned instantly after the order is purchased, and you need not set it up manually. To access your VPS server, click on the ‘Admin Details’ tab and a new window opens. You can now access the Server Management Panel, WHMCS, and cPanel to manage your orders.

Conclusion: With this, we come to the end of our series on VPS Hosting. 
We hope you now know how VPS Hosting works, as well as how to set up VPS Hosting. With ResellerClub, setting up VPS is very easy. If you have any suggestions, queries, or questions, feel free to leave a comment below and we’ll get back to you. The post Understanding the Architecture and Setup of VPS Hosting appeared first on ResellerClub Blog.

Join Cloudflare India Forum in Bangalore on 6 June 2019!

CloudFlare Blog -

Please join us for an exclusive gathering to discover the latest in cloud solutions for Internet security and performance.

Cloudflare Bangalore Meetup
Thursday, 6 June 2019: 15:30 - 20:00
Location: The Oberoi (37-39, MG Road, Yellappa Garden, Yellappa Chetty Layout, Sivanchetti Gardens, Bengaluru)

We will discuss the newest security trends and introduce serverless solutions. We have invited renowned leaders across industries, including big brands and some of the fastest-growing startups. You will learn the insider strategies and tactics that will help you protect your business, accelerate performance, and identify quick wins in a complex Internet environment.

Speakers:
Vaidik Kapoor, Head of Engineering, Grofers
Nithyanand Mehta, VP of Technical Services & GM India, Catchpoint
Viraj Patel, VP of Technology, BookMyShow
Kailash Nadh, CTO, Zerodha
Trey Guinn, Global Head of Solution Engineering, Cloudflare

Agenda:
15:30 - 16:00 - Registration and Refreshments
16:00 - 16:30 - DDoS Landscape and Security Trends
16:30 - 17:15 - Workers Overview and Demo
17:15 - 18:00 - Panel Discussion - Best Practices for a Successful Cyber Security and Performance Strategy
18:00 - 18:30 - Keynote #1 - Future Edge Computing
18:30 - 19:00 - Keynote #2 - Cyber attacks are evolving, so should you: How to adopt a quick-win security policy
19:00 - 20:00 - Happy Hour

View Event Details & Register Here »

We look forward to meeting you there!

Amazon Managed Streaming for Apache Kafka (MSK) – Now Generally Available

Amazon Web Services Blog -

I am always amazed at how our customers are using streaming data. For example, Thomson Reuters, one of the world’s most trusted news organizations for businesses and professionals, built a solution to capture, analyze, and visualize analytics data to help product teams continuously improve the user experience. Supercell, the social game company providing games such as Hay Day, Clash of Clans, and Boom Beach, is delivering in-game data in real time, handling 45 billion events per day. Since we launched Amazon Kinesis at re:Invent 2013, we have continually expanded the ways in which customers work with streaming data on AWS. Some of the available tools are:

Kinesis Data Streams, to capture, store, and process data streams with your own applications.
Kinesis Data Firehose, to transform and collect data into destinations such as Amazon S3, Amazon Elasticsearch Service, and Amazon Redshift.
Kinesis Data Analytics, to continuously analyze data using SQL or Java (via Apache Flink applications), for example to detect anomalies or for time series aggregation.
Kinesis Video Streams, to simplify processing of media streams.

At re:Invent 2018, we introduced in open preview Amazon Managed Streaming for Apache Kafka (MSK), a fully managed service that makes it easy to build and run applications that use Apache Kafka to process streaming data. I am excited to announce that Amazon MSK is generally available today!

How it works

Apache Kafka (Kafka) is an open-source platform that enables customers to capture streaming data like click stream events, transactions, IoT events, and application and machine logs, and to have applications that perform real-time analytics, run continuous transformations, and distribute this data to data lakes and databases in real time. You can use Kafka as a streaming data store to decouple applications producing streaming data (producers) from those consuming streaming data (consumers). 
While Kafka is a popular enterprise data streaming and messaging framework, it can be difficult to set up, scale, and manage in production. Amazon MSK takes care of these management tasks and makes it easy to set up, configure, and run Kafka, along with Apache ZooKeeper, in an environment following best practices for high availability and security. Your MSK clusters always run within an Amazon VPC managed by the MSK service. Your MSK resources are made available to your own VPC, subnet, and security group through elastic network interfaces (ENIs) which will appear in your account, as described in the following architectural diagram: Customers can create a cluster in minutes, use AWS Identity and Access Management (IAM) to control cluster actions, authorize clients using TLS private certificate authorities fully managed by AWS Certificate Manager (ACM), encrypt data in transit using TLS, and encrypt data at rest using AWS Key Management Service (KMS) encryption keys. Amazon MSK continuously monitors server health and automatically replaces servers when they fail, automates server patching, and operates highly available ZooKeeper nodes as a part of the service at no additional cost. Key Kafka performance metrics are published in the console and in Amazon CloudWatch. Amazon MSK is fully compatible with Kafka versions 1.1.1 and 2.1.0, so you can continue to run your applications, use Kafka’s admin tools, and use Kafka-compatible tools and frameworks without having to change your code. 
Based on our customer feedback during the open preview, Amazon MSK added many features, such as:

Encryption in transit via TLS between clients and brokers, and between brokers
Mutual TLS authentication using ACM private certificate authorities
Support for Kafka version 2.1.0
99.9% availability SLA
HIPAA eligibility
Cluster-wide storage scale up
Integration with AWS CloudTrail for MSK API logging
Cluster tagging and tag-based IAM policy application
Defining custom, cluster-wide configurations for topics and brokers

AWS CloudFormation support is coming in the next few weeks.

Creating a cluster

Let’s create a cluster using the AWS management console. I give the cluster a name, select the VPC I want to use the cluster from, and choose the Kafka version. I then choose the Availability Zones (AZs) and the corresponding subnets to use in the VPC. In the next step, I select how many Kafka brokers to deploy in each AZ. More brokers allow you to scale the throughput of a cluster by allocating partitions to different brokers. I can add tags to search and filter my resources, apply IAM policies to the Amazon MSK API, and track my costs. For storage, I leave the default storage volume size per broker. I choose to use encryption within the cluster and to allow both TLS and plaintext traffic between clients and brokers. For data at rest, I use the AWS-managed customer master key (CMK), but you can select a CMK in your account, using KMS, to have further control. You can use private TLS certificates to authenticate the identity of clients that connect to your cluster. This feature uses Private Certificate Authorities (CA) from ACM. For now, I leave this option unchecked. In the advanced settings, I leave the default values. For example, I could have chosen a different instance type for my brokers. Some of these settings can be updated using the AWS CLI. 
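The same cluster creation can also be scripted. Here is a minimal boto3 sketch that mirrors the console choices described above; the cluster name and subnet IDs are placeholders, and the request is a simplified illustration rather than a complete production configuration:

```python
# Placeholder request mirroring the console walkthrough: 3 brokers
# (one per AZ in this example), default-sized EBS storage per broker.
request = {
    "ClusterName": "demo-cluster",
    "KafkaVersion": "2.1.0",
    "NumberOfBrokerNodes": 3,  # total brokers, spread across the chosen AZs
    "BrokerNodeGroupInfo": {
        "InstanceType": "kafka.m5.large",
        "ClientSubnets": ["subnet-aaaa", "subnet-bbbb", "subnet-cccc"],
        "StorageInfo": {"EbsStorageInfo": {"VolumeSize": 100}},  # GiB per broker
    },
}

def create_msk_cluster(request):
    # Requires AWS credentials; shown for illustration only.
    import boto3
    return boto3.client("kafka").create_cluster(**request)
```

The call returns the cluster ARN, which you can then use with the CLI commands shown below to monitor the cluster state.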
I create the cluster and monitor the status from the cluster summary, including the Amazon Resource Name (ARN) that I can use when interacting via CLI or SDKs. When the status is active, the client information section provides specific details to connect to the cluster, such as:

The bootstrap servers I can use with Kafka tools to connect to the cluster.
The ZooKeeper connect list of hosts and ports.

I can get similar information using the AWS CLI:

aws kafka list-clusters to see the ARNs of your clusters in a specific region
aws kafka get-bootstrap-brokers --cluster-arn <ClusterArn> to get the Kafka bootstrap servers
aws kafka describe-cluster --cluster-arn <ClusterArn> to see more details on the cluster, including the ZooKeeper connect string

Quick demo of using Kafka

To start using Kafka, I create two EC2 instances in the same VPC; one will be a producer and one a consumer. To set them up as client machines, I download and extract the Kafka tools from the Apache website or any mirror. Kafka requires Java 8 to run, so I install Amazon Corretto 8. On the producer instance, in the Kafka directory, I create a topic to send data from the producer to the consumer:

bin/kafka-topics.sh --create --zookeeper <ZookeeperConnectString> \
  --replication-factor 3 --partitions 1 --topic MyTopic

Then I start a console-based producer:

bin/kafka-console-producer.sh --broker-list <BootstrapBrokerString> \
  --topic MyTopic

On the consumer instance, in the Kafka directory, I start a console-based consumer:

bin/kafka-console-consumer.sh --bootstrap-server <BootstrapBrokerString> \
  --topic MyTopic --from-beginning

Here’s a recording of a quick demo where I create the topic and then send messages from a producer (top terminal) to a consumer of that topic (bottom terminal):

Pricing and availability

Pricing is per Kafka broker-hour and per provisioned storage-hour. There is no cost for the ZooKeeper nodes used by your clusters. AWS data transfer rates apply for data transfer in and out of MSK. 
You will not be charged for data transfer within the cluster in a region, including data transfer between brokers and data transfer between brokers and ZooKeeper nodes. You can migrate your existing Kafka cluster to MSK using tools like MirrorMaker (which comes with open source Kafka) to replicate data from your clusters into an MSK cluster. Upstream compatibility is a core tenet of Amazon MSK. Our code changes to the Kafka platform are released back to open source. Amazon MSK is available in US East (N. Virginia), US East (Ohio), US West (Oregon), Asia Pacific (Tokyo), Asia Pacific (Singapore), Asia Pacific (Sydney), EU (Frankfurt), EU (Ireland), EU (Paris), and EU (London). I look forward to seeing how you are going to use Amazon MSK to simplify building and migrating streaming applications to the cloud!
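As a footnote to the post above, the same connection details shown in the CLI section can be fetched from Python with boto3. The lookup function is illustrative (it needs AWS credentials and a real cluster ARN), and the bootstrap-string parsing helper is my own addition:

```python
def get_connection_info(cluster_arn):
    """Mirror the 'aws kafka' CLI calls: fetch the bootstrap broker string
    and the ZooKeeper connect string for an MSK cluster.
    Requires AWS credentials; shown for illustration only."""
    import boto3
    client = boto3.client("kafka")
    brokers = client.get_bootstrap_brokers(ClusterArn=cluster_arn)
    detail = client.describe_cluster(ClusterArn=cluster_arn)
    return {
        "bootstrap": brokers["BootstrapBrokerString"],
        "zookeeper": detail["ClusterInfo"]["ZookeeperConnectString"],
    }

def parse_brokers(bootstrap_string):
    # Helper of my own: split "host:port,host:port" into (host, port) pairs,
    # the form Kafka client libraries typically expect.
    pairs = []
    for hostport in bootstrap_string.split(","):
        host, port = hostport.rsplit(":", 1)
        pairs.append((host, int(port)))
    return pairs
```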

Now Available – AWS IoT Things Graph

Amazon Web Services Blog -

We announced AWS IoT Things Graph last November and described it as a tool to let you build IoT applications visually. Today I am happy to let you know that the service is now available and ready for you to use! As you will see in a moment, you can represent your business logic in a flow composed of devices and services. Each web service and each type of device (sensor, camera, display, and so forth) is represented in Things Graph as a model. The models hide the implementation details that are peculiar to a particular brand or model of device, and allow you to build flows that can evolve along with your hardware. Each model has a set of actions (inputs), events (outputs), and states (attributes). Things Graph includes a set of predefined models, and also allows you to define your own. You can also use mappings as part of your flow to convert the output from one device into the form expected by other devices. After you build your flow, you can deploy it to the AWS Cloud or an AWS IoT Greengrass-enabled device for local execution. The flow, once deployed, orchestrates interactions between locally connected devices and web services. Using AWS IoT Things Graph Let’s take a quick walk through the AWS IoT Things Graph Console! The first step is to make sure that I have models which represent the devices and web services that I plan to use in my flow. I click Models in the console navigation to get started: The console outlines the three steps that I must follow to create a model, and also lists my existing models: The presence of aws/examples in the URN for each of the devices listed above indicates that they are predefined, and part of the public AWS IoT Things Graph namespace. I click on Camera to learn more about this model; I can see the Properties, Actions, and Events: The model is defined using GraphQL; I can view it, edit it, or upload a file that contains a model definition. Here’s the definition of the Camera: This model defines an abstract Camera device. 
The model, in turn, can reference definitions for one or more actual devices, as listed in the Devices section: Each of the devices is also defined using GraphQL. Of particular interest is the use of MQTT topics & messages to define actions: Earlier, I mentioned that models can also represent web services. When a flow that references a model of this type is deployed, activating an action on the model invokes a Greengrass Lambda function. Here’s how a web service is defined: Now I can create a flow. I click Flows in the navigation, and click Create flow: I give my flow a name and enter a description: I start with an empty canvas, and then drag nodes (Devices, Services, or Logic) to it: For this demo (which is fully explained in the AWS IoT Things Graph User Guide), I’ll use a MotionSensor, a Camera, and a Screen: I connect the devices to define the flow: Then I configure and customize it. There are lots of choices and settings, so I’ll show you a few highlights, and refer you to the User Guide for more info. I set up the MotionSensor so that a change of state initiates this flow: I also (not shown) configure the Camera to perform the Capture action, and the Screen to display it. I could also make use of the predefined Services: I can also add Logic to my flow: Like the models, my flow is ultimately defined in GraphQL (I can view and edit it directly if desired): At this point I have defined my flow, and I click Publish to make it available for deployment: The next steps are: Associate – This step assigns an actual AWS IoT Thing to a device model. I select a Thing, and then choose a device model, and repeat this step for each device model in my flow: Deploy – I create a Flow Configuration, target it at the Cloud or Greengrass, and use it to deploy my flow (read Creating Flow Configurations to learn more). 
Things to Know I’ve barely scratched the surface here; AWS IoT Things Graph provides you with a lot of power and flexibility and I’ll leave you to discover more on your own! Here are a couple of things to keep in mind: Pricing – Pricing is based on the number of steps executed (for cloud deployments) or deployments (for edge deployments), and is detailed on the AWS IoT Things Graph Pricing page. API Access – In addition to console access, you can use the AWS IoT Things Graph API to build your models and flows. Regions – AWS IoT Things Graph is available in the US East (N. Virginia), US West (Oregon), Europe (Ireland), Asia Pacific (Sydney), and Asia Pacific (Tokyo) Regions. — Jeff;    
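Since the post mentions API access, here is a hedged boto3 sketch of publishing a flow programmatically. The operation name and definition structure reflect my understanding of the AWS IoT Things Graph API, and the GraphQL text below is a placeholder, not a valid flow definition; consult the User Guide for the real definition language:

```python
# Placeholder flow definition; Things Graph flows are expressed in a
# GraphQL dialect (see the AWS IoT Things Graph User Guide).
flow_definition = {
    "language": "GRAPHQL",
    "text": "{ ... flow definition in the Things Graph GraphQL dialect ... }",
}

def publish_flow(definition):
    # Requires AWS credentials; shown for illustration only.
    import boto3
    client = boto3.client("iotthingsgraph")
    return client.create_flow_template(definition=definition)
```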

New – Data API for Amazon Aurora Serverless

Amazon Web Services Blog -

If you have ever written code that accesses a relational database, you know the drill. You open a connection, use it to process one or more SQL queries or other statements, and then close the connection. You probably used a client library that was specific to your operating system, programming language, and your database. At some point you realized that creating connections took a lot of clock time and consumed memory on the database engine, and soon after found out that you could (or had to) deal with connection pooling and other tricks. Sound familiar? The connection-oriented model that I described above is adequate for traditional, long-running programs where the setup time can be amortized over hours or even days. It is not, however, a great fit for serverless functions that are frequently invoked and that run for time intervals that range from milliseconds to minutes. Because there is no long-running server, there’s no place to store a connection identifier for reuse. Aurora Serverless Data API In order to resolve this mismatch between serverless applications and relational databases, we are launching a Data API for the MySQL-compatible version of Amazon Aurora Serverless. This API frees you from the complexity and overhead that come along with traditional connection management, and gives you the power to quickly and easily execute SQL statements that access and modify your Amazon Aurora Serverless Database instances. The Data API is designed to meet the needs of both traditional and serverless apps. It takes care of managing and scaling long-term connections to the database and returns data in JSON form for easy parsing. All traffic runs over secure HTTPS connections. It includes the following functions: ExecuteStatement – Run a single SQL statement, optionally within a transaction. BatchExecuteStatement – Run a single SQL statement across an array of data, optionally within a transaction. 
BeginTransaction – Begin a transaction, and return a transaction identifier. Transactions are expected to be short (generally 2 to 5 minutes). CommitTransaction – End a transaction and commit the operations that took place within it. RollbackTransaction – End a transaction without committing the operations that took place within it. Each function must run to completion within 1 minute, and can return up to 1 megabyte of data. Using the Data API I can use the Data API from the Amazon RDS Console, the command line, or by writing code that calls the functions that I described above. I’ll show you all three in this post. The Data API is really easy to use! The first step is to enable it for the desired Amazon Aurora Serverless database. I open the Amazon RDS Console, find & select the cluster, and click Modify: Then I scroll down to the Network & Security section, click Data API, and Continue: On the next page I choose to apply the settings immediately, and click Modify cluster: Now I need to create a secret to store the credentials that are needed to access my database. I open the Secrets Manager Console and click Store a new secret. I leave Credentials for RDS selected, enter a valid database user name and password, optionally choose a non-default encryption key, and then select my serverless database. Then I click Next: I name my secret and tag it, and click Next to configure it: I use the default values on the next page, click Next again, and now I have a brand new secret: Now I need two ARNs, one for the database and one for the secret. I fetch both from the console, first for the database: And then for the secret: The pair of ARNs (database and secret) provides me with access to my database, and I will protect them accordingly! Using the Data API from the Amazon RDS Console I can use the Query Editor in the Amazon RDS Console to run queries that call the Data API. I open the console and click Query Editor, and create a connection to the database. 
I select the cluster, enter my credentials, and pre-select the table of interest. Then I click Connect to database to proceed. I enter a query and click Run, and view the results within the editor.

Using the Data API from the Command Line

I can exercise the Data API from the command line:

$ aws rds-data execute-statement \
    --secret-arn "arn:aws:secretsmanager:us-east-1:123456789012:secret:aurora-serverless-data-api-sl-admin-2Ir1oL" \
    --resource-arn "arn:aws:rds:us-east-1:123456789012:cluster:aurora-sl-1" \
    --database users \
    --sql "show tables" \
    --output json

I can use jq to pick out the part of the result that is of interest to me:

... | jq .records
[ { "values": [ { "stringValue": "users" } ] } ]

I can query the table and get the results (the SQL statement is "select * from users where userid='jeffbarr'"):

... | jq .records
[ { "values": [ { "stringValue": "jeffbarr" }, { "stringValue": "Jeff" }, { "stringValue": "Barr" } ] } ]

If I specify --include-result-metadata, the query also returns data that describes the columns of the result (I’ll show only the first one in the interest of frugality):

... | jq .columnMetadata[0]
{
  "type": 12,
  "name": "userid",
  "label": "userid",
  "nullable": 1,
  "isSigned": false,
  "arrayBaseColumnType": 0,
  "scale": 0,
  "schemaName": "",
  "tableName": "users",
  "isCaseSensitive": false,
  "isCurrency": false,
  "isAutoIncrement": false,
  "precision": 15,
  "typeName": "VARCHAR"
}

The Data API also allows me to wrap a series of statements in a transaction, and then either commit or roll back. Here’s how I do that (I’m omitting --secret-arn and --resource-arn for clarity):

$ ID=`aws rds-data begin-transaction --database users --output json | jq .transactionId`
$ echo $ID
"ATP6Gz88GYNHdwNKaCt/vGhhKxZs2QWjynHCzGSdRi9yiQRbnrvfwF/oa+iTQnSXdGUoNoC9MxLBwyp2XbO4jBEtczBZ1aVWERTym9v1WVO/ZQvyhWwrThLveCdeXCufy/nauKFJdl79aZ8aDD4pF4nOewB1aLbpsQ=="
$ aws rds-data execute-statement --transaction-id $ID --database users --sql "..."
$ ... 
$ aws rds-data execute-statement --transaction-id $ID --database users --sql "..."
$ aws rds-data commit-transaction $ID

If I decide not to commit, I invoke rollback-transaction instead.

Using the Data API with Python and Boto

Since this is an API, programmatic access is easy. Here’s some very simple Python / Boto code:

import boto3

client = boto3.client('rds-data')

response = client.execute_sql(
    secretArn = 'arn:aws:secretsmanager:us-east-1:123456789012:secret:aurora-serverless-data-api-sl-admin-2Ir1oL',
    database = 'users',
    resourceArn = 'arn:aws:rds:us-east-1:123456789012:cluster:aurora-sl-1',
    sql = 'select * from users'
)

for user in response['records']:
    userid = user[0]['stringValue']
    first_name = user[1]['stringValue']
    last_name = user[2]['stringValue']
    print(userid + ' ' + first_name + ' ' + last_name)

And the output:

$ python
jeffbarr Jeff Barr
carmenbarr Carmen Barr

Genuine, production-quality code would reference the table columns symbolically using the metadata that is returned as part of the response. By the way, my Amazon Aurora Serverless cluster was configured to scale capacity all the way down to zero when not active. Here’s what the scaling activity looked like while I was writing this post and running the queries:

Now Available

You can make use of the Data API today in the US East (N. Virginia), US East (Ohio), US West (Oregon), Asia Pacific (Tokyo), and Europe (Ireland) Regions. There is no charge for the API, but you will pay the usual price for data transfer out of AWS. — Jeff;
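As a footnote to the post above, the transaction functions it lists (BeginTransaction, ExecuteStatement, CommitTransaction, RollbackTransaction) map to boto3 operations as well. Here is a hedged sketch using hypothetical ARNs and the users table from the examples; it requires AWS credentials and a Data API-enabled cluster to actually run:

```python
# Hypothetical ARNs; substitute your own secret and cluster ARNs.
SECRET_ARN = "arn:aws:secretsmanager:us-east-1:123456789012:secret:example"
CLUSTER_ARN = "arn:aws:rds:us-east-1:123456789012:cluster:example"

def insert_user_transactionally(userid, first_name, last_name):
    """Wrap an INSERT in a Data API transaction, rolling back on error.
    Requires AWS credentials; shown for illustration only."""
    import boto3
    client = boto3.client("rds-data")
    tx = client.begin_transaction(
        secretArn=SECRET_ARN, resourceArn=CLUSTER_ARN, database="users")
    tx_id = tx["transactionId"]
    try:
        client.execute_statement(
            secretArn=SECRET_ARN, resourceArn=CLUSTER_ARN, database="users",
            transactionId=tx_id,
            sql="insert into users values(:u, :f, :l)",
            parameters=[
                {"name": "u", "value": {"stringValue": userid}},
                {"name": "f", "value": {"stringValue": first_name}},
                {"name": "l", "value": {"stringValue": last_name}},
            ],
        )
        client.commit_transaction(
            secretArn=SECRET_ARN, resourceArn=CLUSTER_ARN, transactionId=tx_id)
    except Exception:
        client.rollback_transaction(
            secretArn=SECRET_ARN, resourceArn=CLUSTER_ARN, transactionId=tx_id)
        raise
```

Because the Data API is stateless HTTPS, this pattern fits serverless functions well: there is no connection pool to manage, and an uncaught error simply rolls the transaction back.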

Why (and how) WordPress Works for Us

WP Engine -

Today WP Engine is the digital experience platform for WordPress used by 95,000 customers across 150 countries. But, we didn’t start that way. In 2010, I founded WP Engine based on the knowledge that there was a need for a premium WordPress service that would deliver the speed, scalability, and security that websites required. The idea… The post Why (and how) WordPress Works for Us appeared first on WP Engine.

New – AWS IoT Events: Detect and Respond to Events at Scale

Amazon Web Services Blog -

As you may have been able to tell from many of the announcements that we have made over the last four or five years, we are working to build a wide-ranging set of Internet of Things (IoT) services and capabilities. Here’s a quick recap:

October 2015 – AWS IoT Core – A fundamental set of Cloud Services for Connected Devices.
June 2017 – AWS Greengrass – The ability to Run AWS Lambda Functions on Connected Devices.
November 2017 – AWS IoT Device Management – Onboarding, Organization, Monitoring, and Remote Management of Connected Devices.
November 2017 – AWS IoT Analytics – Advanced Data Analysis for IoT Devices.
November 2017 – Amazon FreeRTOS – An IoT Operating System for Microcontrollers.
April 2018 – Greengrass ML Inference – The power to do Machine Learning Inference at the Edge.
August 2018 – AWS IoT Device Defender – A service that helps to Keep Your Connected Devices Safe.

Last November we also announced our plans to launch four new IoT services:

AWS IoT SiteWise to collect, structure, and search data from industrial equipment at scale.
AWS IoT Events to detect and respond to events at scale.
AWS IoT Things Graph to build IoT applications visually.
AWS IoT Greengrass Connectors to simplify and accelerate the process of connecting devices.

You can use these services individually or together to build all sorts of powerful, connected applications!

AWS IoT Events Now Available

Today we are making AWS IoT Events available in production form in four AWS Regions. You can use this service to monitor and respond to events (patterns of data that identify changes in equipment or facilities) at scale. You can detect a misaligned robot arm, a motion sensor that triggers outside of business hours, an unsealed freezer door, or a motor that is running outside of tolerance, all with the goal of driving faster and better-informed decisions.
As you will see in a moment, you can easily create detector models that represent your devices, their states, and the transitions (driven by sensors and events, both known as inputs) between the states. The models can trigger actions when critical events are detected, allowing you to build robust, highly automated systems. Actions can, for example, send a text message to a service technician or invoke an AWS Lambda function.

You can access AWS IoT Events from the AWS IoT Events Console or by writing code that calls the AWS IoT Events API functions. I’ll use the Console, and I will start by creating a detector model. I click Create detector model to begin:

I have three options; I’ll go with the demo by clicking Launch demo with inputs:

This shortcut creates an input and a model, and also enables some “demo” functionality that sends data to the model. The model looks like this:

Before examining the model, let’s take a look at the input. I click on Inputs in the left navigation to see them:

I can see all of my inputs at a glance; I click on the newly created input to learn more:

This input represents the battery voltage measured from a device that is connected to a particular powerwallId:

Ok, let’s return to (and dissect) the detector model! I return to the navigation, click Detector models, find my model, and click it:

There are three Send options at the top; each one sends data (an input) to the detector model. I click on Send data for Charging to get started. This generates a message that looks like this; I click Send data to do just that:

Then I click Send data for Charged to indicate that the battery is fully charged. The console shows me the state of the detector:

Each time an input is received, the detector processes it. Let’s take a closer look at the detector. It has three states (Charging, Charged, and Discharging):

The detector starts out in the Charging state, and transitions to Charged when the Full_charge event is triggered.
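To make the detector-model concept concrete, the Charging/Charged/Discharging model (plus the LowBattery refinement mentioned later) can be sketched as a plain state machine. This is only an illustration of the idea, not the AWS IoT Events API; the voltage thresholds and the "notify technician" action are hypothetical:

```python
# Illustrative state machine mirroring the battery detector model:
# states, input-driven transitions, and an action fired on one transition.
# Thresholds and action names are hypothetical.

FULL_CHARGE_VOLTAGE = 4.2
LOW_BATTERY_VOLTAGE = 3.0

class BatteryDetector:
    def __init__(self):
        self.state = "Charging"
        self.actions = []          # stand-in for SNS/MQTT/Lambda actions

    def on_input(self, voltage, charging):
        # Trigger logic runs on every input, just as a real detector
        # evaluates each message delivered via BatchPutMessage.
        if self.state == "Charging" and voltage >= FULL_CHARGE_VOLTAGE:
            self.state = "Charged"                   # Full_charge event
        elif self.state == "Charged" and not charging:
            self.state = "Discharging"
        elif self.state == "Discharging" and voltage <= LOW_BATTERY_VOLTAGE:
            self.state = "LowBattery"
            self.actions.append("notify technician")  # event action
        return self.state

d = BatteryDetector()
d.on_input(voltage=4.2, charging=True)    # Charging -> Charged
d.on_input(voltage=4.1, charging=False)   # Charged -> Discharging
print(d.on_input(voltage=2.9, charging=False), d.actions)
# LowBattery ['notify technician']
```

In the real service, this structure is defined declaratively in the console or API, and AWS runs a detector instance per key value (e.g., per powerwallId).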
Here’s the definition of the event, including the trigger logic:

The trigger logic is evaluated each time an input is received (your IoT app must call BatchPutMessage to inform AWS IoT Events). If the trigger logic evaluates to a true condition, the model transitions to the new (destination) state, and it can also initiate an event action. This transition has no actions; I can add one (or more) by clicking Add action. My choices are:

Send MQTT Message – Send a message to an MQTT topic.
Send SNS Message – Send a message to an SNS target, identified by an ARN.
Set Timer – Set, reset, or destroy a timer. Timer durations can be expressed in seconds, minutes, hours, days, or months.
Set Variable – Set, increment, or decrement a variable.

Returning (once again) to the detector, I can modify the states as desired. For example, I could fine-tune the Discharging aspect of the detector by adding a LowBattery state:

After I create my inputs and my detector, I Publish the model so that my IoT devices can use and benefit from it. I click Publish and fill in a few details:

The Detector generation method has two options. I can Create a detector for each unique key value (if I have a bunch of devices), or I can Create a single detector (if I have one device). If I choose the first option, I need to choose the key that separates one device from another.

Once my detector has been published, I can send data to it using AWS IoT Analytics, IoT Core, or from a Lambda function.

Get Started Today

We are launching AWS IoT Events in the US East (N. Virginia), US East (Ohio), US West (Oregon), and Europe (Ireland) Regions and you can start using it today!

— Jeff;

Reseller Hosting Is Your Business Model Delivered

InMotion Hosting Blog -

Our Reseller Hosting is a Linux-powered open-source juggernaut. Yes, we’re proud of it. And we stand behind the features and addons that make our service not only meet but exceed industry standards and expectations. If you’re not familiar with the “reseller” hosting model and how it differs from an individual hosting account, we’re going to go through it point by point. Reseller Hosting is a Model For Your Own Hosting Business Have you ever thought about starting your own business? Continue reading Reseller Hosting Is Your Business Model Delivered at The Official InMotion Hosting Blog.

