Industry Buzz

What Is VPS Used For?

HostGator Blog -

You may be here because you’ve heard the term VPS thrown around a lot, and you’re wondering what this acronym actually means. VPS stands for Virtual Private Server, and the term usually comes up in the context of VPS hosting. It’s also often confused with VPN, although VPS and VPN are two different things. There’s a lot more you can do with a VPS than just host a website, but we’ll get into that below. VPS hosting is typically the natural next step once you’ve run into the limitations of a traditional shared hosting plan. Below we’ll answer the questions “what does VPS stand for?” and “what is VPS used for?” in depth. By the end of this post, you’ll know whether a VPS is right for your needs.

What Does VPS Stand For?

As you learned in the introduction, VPS stands for Virtual Private Server. This kind of server environment operates similarly to a dedicated server; however, instead of renting a single physical server, you’re sharing server hardware that is linked together and partitioned through virtualization technologies. You can think of a VPS as a cross between a shared server and a dedicated server: you’re sharing a physical server with other website owners, but it’s divided up so that multiple virtual dedicated servers are present, hence the “virtual” in VPS.

What is VPS Hosting?

If you have asked yourself “what is VPS hosting?”, this section provides an in-depth look at this hosting service. VPS hosting is a step up from basic shared hosting. When you’re just getting started online, shared hosting will probably be the form of hosting you start with. With shared hosting, a single physical server is divided up between multiple user accounts. In this scenario, you’re splitting physical server resources, which helps keep costs low. On a basic level, VPS hosting has a similar setup.
When you sign up for VPS hosting you have a guaranteed amount of server resources allocated to you, but you’re still sharing space on a physical server with other users. Thanks to the virtualization technology employed, though, VPS hosting differs from shared hosting in important ways. Even though you might be sharing the same physical server, there won’t be any overlap in resource use, and the other VPS accounts won’t affect your site in any way. Think of it as a single dedicated server that’s split across multiple physical server environments.

VPS hosting is a great choice for website owners who have outgrown shared hosting, yet aren’t quite ready for the price tag or features of a dedicated server. You can easily migrate from shared hosting to VPS hosting while still staying within a reasonable price point.

Pros of VPS Hosting

For some website owners, VPS hosting will be a godsend. Acting as the intermediary between shared and dedicated hosting, VPS can provide you with a lot of benefits. Here are the most common reasons website owners decide to upgrade to VPS hosting.

1. High Level of Performance

A slow-loading website does a disservice to your visitors and your website as a whole. If you’ve been using shared hosting and noticing a drop in performance, then one of the first things you’ll see after upgrading is an improvement in your loading speeds and overall site performance. VPS hosting is equipped to handle higher traffic levels right out of the gate. Plus, you have the ability to scale your server resources if your needs expand over time.

2. Improved Overall Security

When your site starts to grow in popularity, there’s a chance you’ll start to experience more security threats. Even if you’ve done everything in your power to harden your site’s security, you could still be experiencing issues. In this case, it’s time to upgrade your hosting. VPS hosting offers you very high levels of security.
You’re not only protected from other sites using the same physical server, but you’ll also be able to implement additional security-hardening protocols.

3. Great Value Pricing

VPS hosting might not be in everyone’s budget, but it offers great value for the resources you get access to. Essentially, you’re getting access to a dedicated server at a fraction of the cost. Plus, with VPS hosting you’ll enable higher levels of performance and elevate the security protocols surrounding your site. Compared to shared hosting, you’re getting a serious upgrade in hosting quality without a massive jump in price.

4. Greater Server Access and Customization

VPS web hosting will generally provide you with a greater level of server access, along with the ability to customize your server environment as you see fit. Some plans, like WordPress VPS hosting, will have certain restrictions on plugin use and overall configuration. Others, however, operate more or less as a clean slate, allowing you to choose your operating system and build whatever configuration will supercharge your site the most. Keep in mind that some hosts also offer managed VPS web hosting, which means the majority of the technical tasks required to manage your server are taken care of by their teams. This option frees up your time and ensures your server is always optimized to your website’s specifications.

Cons of VPS Hosting

Even though VPS hosting seems pretty great, it’s not the perfect fit for every kind of website owner. Here are some of the most common reasons people decide not to go with VPS hosting:

1. Prohibitive Pricing

Even though VPS hosting is quite cost-effective for all of the features it includes, the pricing can still be steep for some website owners. If a basic shared hosting plan is already stretching your budget, then VPS might not be the right option for you.
VPS hosting does seem cheap compared to the more expensive dedicated hosting plans, but it’s still a sizable step up from shared hosting.

2. Poor Resource Allocation With Low-Quality Hosts

VPS hosting relies upon proper resource allocation. If you’re using a low-quality host, another site on the same physical server may impact your site, or your site otherwise won’t be able to perform at the level you’ve grown used to. A high-quality host should help you easily avoid both of these issues.

What is VPS Used For?

Beyond hosting a website, VPS servers have a myriad of other uses. Even if you’re currently happy with your existing hosting plan, you might want to check out VPS hosting for these other scenarios. Here are the most common VPS use cases beyond your standard hosting plan:

1. Hosting Your Own Personal Server

There’s a multitude of reasons to run your own server environment outside of simply hosting your website. A VPS gives you your own virtual playground for additional online activities. For example, maybe you want your own dedicated server for games. For some people, the cost of a dedicated server is prohibitive, but you could instead run a VPS to host smaller game matches or create your own custom game environment. Not every hosting company will allow you to run a gaming server via VPS, so make sure you read the terms and conditions, or contact support, before you go this route.

2. Testing New Applications

If you regularly deploy web applications or test out custom server setups, you’ll need your own server environment to test these things out. But an entire dedicated server might be too expensive to warrant simple testing. In this case, a VPS fits the bill perfectly, giving you a playground to do whatever you wish without incurring high monthly costs.

3. Additional File Storage

Sometimes you want to create another backup of your files, but cloud storage accounts can become expensive. If you want to create secure and easily accessible backups, consider using a VPS. Overall, this might end up being cheaper than a cloud hosting account, depending on the volume of files you need stored. However, keep in mind that not every hosting provider will allow VPS accounts to be used for pure file storage, so double-check the terms and conditions before you move forward.

VPS Hosting Showdown

By now you understand what a VPS hosting solution is, and the other reasons you might want to deploy a Virtual Private Server. Now it’s time to see how VPS hosting compares to the other forms of hosting out there. For those thinking about upgrading their current hosting package, this section is for you.

1. VPS vs Shared Hosting

We covered shared hosting a bit above, but it’s worth digging into it in a bit more detail. With shared hosting, you’re renting space on a physical server that’s shared with multiple other users. The server is partitioned between users, but there is a chance that other sites on the same server could impact your site. With a VPS hosting solution you’re still sharing a physical server with other users, but the underlying technology is much different: a VPS utilizes what’s known as a hypervisor, which ensures that you always have access to the guaranteed level of server resources specified in your hosting plan. Shared hosting is a great place to start, but once you’ve run into its limits, VPS is a great next step. Plus, VPS hosting has the added benefit of being able to scale with your site.

2. VPS vs Dedicated Hosting

Dedicated hosting is pretty simple: you’re renting an entire physical server that’s yours to do with whatever you want.
It’s one of the more expensive forms of hosting available, but it provides very high levels of performance and security while offering you the ability to customize your server however you see fit. A VPS behaves differently from a dedicated server: you have your own virtualized dedicated server to use as you see fit, but you don’t have your own physical dedicated server, just a virtual one. If you have a very high traffic website, or require very high levels of security, then a dedicated server might be a better fit, though keep in mind that you’ll need a larger budget than for VPS hosting. If you don’t have the budget for a dedicated host, then VPS hosting will suit you fine until it’s possible to upgrade.

3. VPS vs Cloud Hosting

Cloud hosting is one of the newer forms of hosting on the block. Overall, cloud hosting is similar to VPS in that it uses virtualization technologies to create a server environment. However, with cloud hosting, a network of servers is grouped together to create a cloud server cluster. This setup provides very high levels of reliability and scalability, so if your traffic levels swing up and down from month to month, this style of hosting could be advantageous. VPS hosting operates in a similar fashion by creating a virtualized web server environment across a few physical servers (if your resource needs require it). However, with VPS hosting you should have a more stable volume of traffic per month, even if it’s rising on a consistent basis.

In Closing: Do You Need to Use VPS?

VPS hosting is a perfect fit for those who require the resources that a dedicated server can provide, but aren’t quite ready for a dedicated web server. When it comes to your website, VPS hosting offers you higher levels of performance, storage, and scalability if the need arises. However, you might also consider a VPS for deploying and testing projects, running your own personal server, or even for additional file storage and website backups.

Whether or not you need to upgrade to a VPS depends on whether you’ve hit the limits of your existing hosting package, or want to try a VPS for any of the reasons highlighted above. Hopefully, you now have a better understanding of what a VPS is used for, even beyond the realm of hosting your website. If you’ve hit the limits of your shared hosting account, then upgrading to VPS hosting can be a great decision for the future of your website.
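If you suspect your shared plan is the bottleneck, measuring load time before and after a migration gives you hard numbers to compare. Here is a minimal sketch using only the Python standard library; the URL in the comment is a placeholder for your own site, and the median is used because a single fetch can be noisy.

```python
import time
import urllib.request

def median_load_time(url, samples=5):
    """Fetch a URL several times and return the median response time in seconds."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # include body transfer, not just headers
        timings.append(time.perf_counter() - start)
    timings.sort()
    return timings[len(timings) // 2]

# Run this before and after migrating from shared hosting to VPS and compare:
# print(median_load_time("https://www.example.com"))
```

Running it from a few different locations (or from a VPS in another region) gives a rough picture of what your visitors actually experience.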

Cloudflare Repositories FTW

CloudFlare Blog -

This is a guest post by Jim “Elwood” O’Gorman, one of the maintainers of Kali Linux. Kali Linux is a Debian-based GNU/Linux distribution popular amongst the security research communities.

Kali Linux turned six years old this year! In this time, Kali has established itself as the de-facto standard open source penetration testing platform. On a quarterly basis, we release updated ISOs for multiple platforms, pre-configured virtual machines, Kali Docker, WSL, Azure, AWS images, tons of ARM devices, Kali NetHunter, and on and on and on. This has led to Kali being trusted and relied on to always be there for both security professionals and enthusiasts alike.

But that popularity has always led to one complication: how do we get Kali to people? With so many different downloads plus the apt repository, we have to move a lot of data. To accomplish this, we have always relied on our network of first- and third-party mirrors. The way this works is, we run a master server that pushes out to a number of mirrors. We pay to host a number of servers that are geographically dispersed and use them as our first-party mirrors. Then, a number of third parties donate storage and bandwidth to operate third-party mirrors, ensuring that we have even more systems that are geographically close to you. When you go to download, you hit a redirector that sends you to a mirror close to you, ideally allowing you to download your files quickly.

This solution has always been pretty decent, but it has some drawbacks. First, our network of first-party mirrors is expensive. Second, some mirrors are not as good as others. Nothing is worse than trying to download Kali and getting sent to a slow mirror, where your download might drag on for hours.
Third, we always need more mirrors as Kali continues to grow in popularity.

This situation led to us encountering Cloudflare, thanks to some extremely generous outreach on Twitter from Justin (@xxdesmus) on June 29, 2018 (“…https://t.co/k6M5UZxhWF and we can chat more about your specific use case.”).

I will be honest, we are a bunch of security nerds, so we were a bit skeptical at first. We have some pretty unique needs, we use a lot of bandwidth, syncing an apt repository to a CDN is no small task, and, well, we are paranoid. We average 1,000,000 downloads a month on just our ISO images. Add in our apt repos and you are talking some serious, serious traffic. So how much help could we really expect from Cloudflare anyway? Were we really going to be able to put this to use, or would this just be a nice fancy front end to our website and nothing else? On the other hand, it was a chance to use something new and shiny, and it is an expensive product, so of course we dove right in to play with it.

Initially we had some sync issues. A package repository is a mix of static data (binary and source packages) and dynamic data (package lists are updated every 6 hours). To make things worse, the cryptographic sealing of the metadata means that we need atomic updates of all the metadata (the signed top-level ‘Release’ file contains checksums of all the binary and source package lists).

The default behavior of a CDN is not appropriate for this purpose, as it caches all files for a certain amount of time after they have been fetched for the first time. This means that you could have different versions of various metadata files in the cache, resulting in invalid-checksum errors returned by apt-get. So we had to implement a few tweaks to make it work and reap the full benefits of Cloudflare’s CDN network.

First, we added an “Expires” HTTP header to disable expiration of all files that will never change.
Then we added another HTTP header to tag all metadata files so that we could manually purge those files from the CDN cache through an API call, which we integrated at the end of the repository update procedure on our backend server. With nginx on our backend, the configuration looks like this:

    location /kali/dists/ {
        add_header Cache-Tag metadata,dists;
    }
    location /kali/project/trace/ {
        add_header Cache-Tag metadata,trace;
        expires 1h;
    }
    location /kali/pool/ {
        add_header Cache-Tag pool;
        location ~ \.(deb|udeb|dsc|changes|xz|gz|bz2)$ {
            expires max;
        }
    }

The API call is a simple shell script launched by a hook of the repository mirroring script:

    #!/bin/sh
    curl -sS -X POST "https://api.cloudflare.com/client/v4/zones/xxxxxxxxxxx/purge_cache" \
        -H "Content-Type:application/json" \
        -H "X-Auth-Key:XXXXXXXXXXXXX" \
        -H "X-Auth-Email:your-account@example.net" \
        --data '{"tags":["metadata"]}'

With this simple yet powerful feature, we ensure that the CDN cache always contains consistent versions of the metadata files. Going further, we might want to configure Prefetching so that Cloudflare downloads all the package lists as soon as a user downloads the top-level ‘Release’ file.

In short, we were using this system in a way that was never intended, but it worked! This really reduced the load on our backend, as a single server could feed the entire CDN, while putting the files geographically close to users and allowing the classic apt dist-upgrade to occur much, much faster than ever before. A huge benefit, and it was not really a lot of work to set up. Sevki Hasirci was there with us the entire time as we worked through this process, ensuring any questions we had were answered promptly. A great win.

However, there was just one problem. Looking at our logs, while the apt repo was working perfectly, our image distribution was not so great.
None of those images were getting cached, and our origin server was dying. Talking with Sevki, it turned out there were limits to how large a file Cloudflare would cache. He upped our limit to the system capacity, but that still was not enough for how large some of our images are. At this point, we just assumed that was that: we could use this solution for the repo, but for our image distribution it would not help. However, Sevki told us to wait a bit. He had a surprise in the works for us.

After some development time, Cloudflare pushed out an update to address our issue, allowing us to cache very large files. With that in place, everything just worked with no additional tweaking. Even items like partial downloads for users using download accelerators worked just fine. Amazing!

To show what this translated into, let’s look at our traffic. Once the very large file support was added and we started to push out our images through Cloudflare, there was no real increase in requests, but there was a clear increase in bandwidth, and after it had been running for a while, a clear pattern emerged. This pushed us from around 80 TB a week when we had just the repo to now around 430 TB a month with both the repo and images. As you can imagine, that’s an amazing bandwidth savings for an open source project such as ours.

Performance is great, and with a cache hit rate of over 97% (amazingly high considering how frequently files in our repo change), we could not be happier.

So what’s next? That’s the question we are asking ourselves. This solution has worked so well that we are looking at other ways to leverage it, and there are a lot of options. One thing is for sure: we are not done with this. Thanks to Cloudflare, Sevki, Justin, and Matthew for helping us along this path.
It is fair to say this is the single largest contribution to Kali that we have received outside of the support of Offensive Security. The support we received from Cloudflare was amazing. The Kali project and community thank you immensely every time they update their distribution or download an image.
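For anyone replicating this setup outside of a shell hook, the tag-based purge in the script above can also be built in Python. This is a minimal sketch, assuming the same placeholder zone ID and credentials as the shell example; the endpoint and `{"tags": [...]}` payload match the Cloudflare purge_cache call shown above.

```python
import json
import urllib.request

def purge_by_tag(zone_id, auth_email, auth_key, tags):
    """Build a Cloudflare purge-by-tag request (POST /zones/<id>/purge_cache)."""
    url = f"https://api.cloudflare.com/client/v4/zones/{zone_id}/purge_cache"
    body = json.dumps({"tags": tags}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "X-Auth-Email": auth_email,
            "X-Auth-Key": auth_key,
        },
        method="POST",
    )

# Hook this at the end of the repository update procedure, exactly like
# the shell script: purge only the files tagged "metadata".
req = purge_by_tag("xxxxxxxxxxx", "your-account@example.net", "XXXXXXXXXXXXX",
                   ["metadata"])
# urllib.request.urlopen(req) would send it; it is left unsent here.
```

Keeping the purge limited to the `metadata` tag is what preserves the atomicity described above: the static `pool/` files are never purged, only the signed package lists.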

Thinking Global Business? You need a .GLOBAL Domain name

BigRock Blog -

The Internet has brought the world closer. All you need is a website to start tapping the global market. But you need to choose the right domain extension to attract international visitors, and you will want to clarify a few apprehensions as well: Should I go with the “.com” domain? Can a ccTLD work well in different markets? If I choose a new generic TLD, will it rank the same way? When you try to find answers, you may even think of having separate ccTLDs to cater to each country-specific market. But there is a better way to meet your global aspirations.

Go with the .global Domain Extension

.global is a generic top-level domain that came into existence in 2014. With it, you can reach out to a worldwide audience and portray your business as a global brand. Whether your business sells products or services, you can use the “.global” extension. Thousands of websites have already been registered with this domain in the short period since its inception. Some compelling reasons to choose this extension are:

Global Image

When you pair this generic top-level domain with your domain name, for instance www.domain-name.global, it instantly conveys the global branding message you want your visitors to see, an effect that “.com” or another extension can’t create in the same way.

Global Traffic

When you are doing business across the globe and dealing across international borders, this extension helps. Foreign visitors coming to the site can easily connect with the domain, and you certainly don’t want them to ignore your website due to country restrictions. When the website is accessible to any user without any limitation of location, the “.global” extension can help attract traffic from visitors across the globe.

Global SEO

With this extension, search engine optimization for the site improves.
Visitors looking for companies with a global presence may use the word “global” while searching on the internet. When you have the “.global” domain extension, it serves as a keyword as well, which can result in improved rank in the search results.

Use Case For This Extension

Say you are a multinational organization with business interests in many countries across the globe, and you want to differentiate your local websites from the international one. Here is what you can do:

Global website: www.businessdomainname.global
Country sites: www.businessdomainname.in, www.businessdomainname.sg, www.businessdomainname.uk

You can use the site with the “.global” extension to cater to international users. Once they reach this site, they come across your global brand. However, if they are looking for something location-specific, you can guide them to the country-specific site and connect them with the local offices. Using this extension for the global site showcases your diversity and allows international site visitors to connect instantly. The word ‘global’ is prevalent in most languages, so there are minimal linguistic issues in sending the branding message.

Can You Register Your Site With It?

You can register easily, with no restrictions at all. Whether you are a global organization or an individual, you can register with ease. Thousands of sites use this domain extension without any issues, and there are minimal chances that you will find a domain name ‘unavailable’ with this extension.

Leading Sites Using This Extension

https://mobian.global/ – A marketplace where mobility providers and mobility resellers can connect quickly.
https://urc.global/ – This company has a presence in 90 countries, providing healthcare, social services, and health education across the world.
http://h2go.global – The company provides clean drinking water to remote and rural areas. More than 1.7 million people across the globe are using its products.
Benefits of registering a “.global” domain extension with BigRock:

- Instant global brand image for your domain name.
- Domain availability is not an issue.
- Improved SEO.
- Helps attract global traffic.

Search Now or Call @ 1800 266 7625
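The global-site/country-site split described in the use case above usually comes down to a small routing rule. Here is a hedged sketch in Python; the domain names are the hypothetical examples from the post, and the country-code mapping is illustrative, not any registrar's implementation.

```python
# Map ISO country codes to the country-specific sites from the example above;
# every other visitor lands on the .global site.
COUNTRY_SITES = {
    "IN": "https://www.businessdomainname.in",
    "SG": "https://www.businessdomainname.sg",
    "GB": "https://www.businessdomainname.uk",
}
GLOBAL_SITE = "https://www.businessdomainname.global"

def site_for_visitor(country_code):
    """Return the site a visitor should be offered, defaulting to .global."""
    return COUNTRY_SITES.get(country_code.upper(), GLOBAL_SITE)

print(site_for_visitor("in"))  # the Indian country site
print(site_for_visitor("US"))  # falls back to the .global site
```

In practice the country code would come from a GeoIP lookup or the visitor's `Accept-Language` header, but the fallback-to-global default is the part that matters.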

What Happens When There’s a Catastrophic Failure in Your Infrastructure

Liquid Web Official Blog -

The Horrific Reality of Catastrophic Failure

The Exorcist doesn’t hold a candle to the idea of a catastrophic failure wiping out your data, your web presence… your entire operation (cue the vomit). It should scare you. Our livelihoods—our lives—are increasingly digital. Your IT infrastructure is integral to your operations. Whether it’s your website, your database, or your inter-office communications and operations, downtime is intolerable. A catastrophe-level shutdown is unfathomable.

Fortunately, there are plenty of ways to safeguard your business from the worst, and you can read elsewhere about how to prevent a disaster with redundancies, a high availability (HA) infrastructure, and other solutions. However, things happen; even the best-laid plans go awry, and sometimes a tornado comes through and takes out your data center. In the event that something catastrophic does occur, you need to be ready, and the best way to be ready is to understand exactly what happens if (and with the right protection, that’s a pretty big if) the walls you’ve built around your business come tumbling down. You need to expect the unexpected, so you’re prepared for anything that comes your way.

Failures Occur. When?

There isn’t an infrastructure out there (no matter how well designed, implemented, or maintained) that is impervious to failure. Failures happen. That’s why HA systems are a thing; it’s why you have redundancies, backups, and other preventative measures. But where and when do they occur? There are 5 particularly vulnerable points in your infrastructure: housing, hardware, ISP, software, and data.

Your first vulnerable point, housing, is your physical accommodations and includes the building that houses your servers/computers, your climate controls, and your electrical supply.
Your housing is only vulnerable in highly specific instances (natural disasters, brownouts, blackouts, etc.) and is pretty easily mitigated. For example, two separate sources of power, uninterruptible power supplies, battery backups, restricted access to server rooms, routine building maintenance, and so on can reliably safeguard this vulnerability in your infrastructure. This goes for your ISP (fiber, cable, wireless) and other vendors as well. Thoroughly vetted, high-quality vendors will have their own HA systems in place, making this vulnerability in your infrastructure a low probability for catastrophic failure.

Your hardware, software, and data, however, are significantly more vulnerable, even though there are steps your company can take to prevent failures. Servers, computers, peripherals, and network equipment age, break down, and fail; it’s just the reality of physical systems. But non-physical systems (productivity and communication software, websites, applications, etc.) are also open to certain failures, including external attacks: DDoS, hacking, bugs, viruses, and human error. Finally, your data can get corrupted on its own or can fail as a result of another failure in the chain; a hardware failure, for example, could wipe out your data.

While some failures can be predicted and prevented—regular maintenance and replacement of equipment to prevent breakdowns, for example—others simply can’t be anticipated. A sudden equipment failure, power outages, natural disasters, a DDoS attack: these can all occur seemingly out of nowhere. You simply have to have a plan in place to react to these events in case they do (almost inevitably) happen.

A good rule of thumb is to create an infrastructure that doesn’t have (or at least attempts to eliminate) a single point of failure. All of these vulnerability points—housing, hardware, ISP, software, and data—are susceptible to single points of failure. Housing?
Make sure you have a physical space you can use in case the first space becomes unviable. Hardware? Make sure you have redundant equipment you can swap in, in case of a failure. ISP, software, data? Redundancies, backups, and backups of backups. Be prepared.

What is the Worst Case Scenario?

In 2007, according to the Los Angeles Times, “a malfunctioning network interface card on a single desktop computer in the Tom Bradley International Terminal at LAX” brought international air travel to an absolute standstill for nine hours. For nine hours, 17,000 passengers were stranded on board—because the affected software was used by U.S. Customs to authorize entry and exit, no one was allowed to disembark. This not only stopped international travel in its tracks; U.S. Customs and the airlines themselves had to supply food, water, and diapers to passengers, and had to keep refueling to keep the environmental controls on the aircraft operating. Oh, and shortly after the system was restored, again according to the Los Angeles Times, it gave out again: “The second outage was caused by a power supply failure.” Now that’s a worst case scenario.

You’re not U.S. Customs or LAX, but you can relate. Almost nine hours of downtime in a single day exceeds what 81% of businesses said they could tolerate in a single year (thanks, Information Technology and Intelligence Corp). Everyone’s worst case scenario is different, but a massive failure that cripples your infrastructure for even a few hours in a single day can have irrevocably adverse effects on your revenue, your workflow, and your relationship with your clients/customers. Any significant downtime should be a cause for concern. Is it a worst case scenario? Maybe not, but a few days in a row—or even over the course of a year—could be.

Automatic Failover vs. No Automatic Failover

While a systems failure is a spectrum of what can go wrong, there are two scenarios on either end: an automatic failover, and a catastrophic failure in which a failover doesn’t take place either manually or automatically. Failover systems themselves can fail, but it’s more likely that there simply isn’t a system in place to automate a switch to a redundant system. What follows is a look into what actually happens during an automatic failover, and what would happen if such a system weren’t in place.

What Happens During an Automatic Failover

Several scenarios can trigger a failover: your secondary node(s) do not receive a heartbeat signal; the primary node experiences a hardware failure; a network interface fails; your HA monitor detects a significant dip in performance; or a failover command is manually sent. When a secondary node does not receive a heartbeat signal (a synchronous, two-way monitor of server operation and performance), there are several possible causes, including a network failure, a hardware failure, or a software crash/reboot. As you can see, an automatic failover is triggered predominantly by an equipment failure. Any time a piece of equipment stops operating—or even begins to perform below its expected values—a failover will be triggered.

It should be noted that there is a difference between a switchover and a failover. A switchover is simply a role reversal of the primary node and a secondary node: a secondary node is chosen to become the primary node, and the primary node becomes a secondary node. This is almost always anticipated and done intentionally; a common switchover scenario is maintenance and upgrading. In a switchover, there is no data loss. A failover, on the other hand, is a role reversal of the primary node and a secondary node in the event of a systems failure (network, hardware, software, power, etc.). A failover may result in data loss depending on the safeguards in place. So, what does happen in an automatic failover?
Let’s break it down:

1. An event occurs that initiates failover. This could be a network failure, a power outage, a software failure, or a hardware failure. In all cases, the heartbeat link between the primary node and the elected secondary node is severed and failover is initiated.
2. An error log (why was a failover initiated?) is created.
3. The elected secondary node takes on the role of the primary node.
4. The failed primary node is removed from the cluster.

What Happens With No Automatic Failover

Ok, so you don’t have an automatic failover safeguard in place and something breaks—or, even worse, a lot of things break. What happens? Well, that’s going to depend on what systems you have in place. If you have working backups but no automatic failover, you’ll retain your data. However, depending on your infrastructure, the time it takes to recognize a failure and then manually switch over will be much longer than with an automatic solution. If your system is sketchy and there are vulnerabilities throughout, things get significantly more complicated and need to be addressed on a case-by-case basis. We can, though, examine what happens in systems with one or more single points of failure at critical junctures. You’re sure to remember housing, hardware, ISP, software, and data.

Housing. In May of 2011, a major tornado ripped through Joplin, MO. In the tornado’s path were a hospital and the hospital’s adjoining data center. The data center held both electronic and physical records. Serendipitously, the hospital IT staff was in the middle of a mass digitization and data migration to an off-site central center with redundant satellites, which meant that most of the data was saved (although some records were irrevocably destroyed) and the hospital was able to mobilize services quickly. However, if the tornado had come any earlier, the data loss would have been extreme.
While this scenario (indeed, any IT housing disaster) is rare, it does happen, and there are ways to safeguard your equipment and your data. According to Pergravis, (offsite backups notwithstanding) the best data center is constructed from reinforced concrete and is designed as a box (the data center) within a shell (the structure surrounding it), which creates a secondary barrier. This is, obviously, a pie-in-the-sky scenario, but Pergravis does offer simpler solutions for shoring up an existing data center. For example, they suggest locating your data center in the middle of your facility, away from exterior walls. If that's not an option, removing and sealing exterior windows will help safeguard your equipment from weather damage.

Hardware. The key to any secure system (the key to HA, as we've discussed here and here) is redundancy, and that includes redundant hardware you might not immediately think of. A few years ago, Microsoft Azure cloud services in Japan went down for an extended period because of a bad rotary uninterruptible power supply (RUPS). As temperatures in the data center rose, equipment began shutting itself off to preserve data, disrupting cloud service in the Japan East region. It's not always going to be a storage device or even a network appliance that fails. Most systems are over-engineered in terms of server component, data backup, and network equipment redundancies; it's up to you to work with your company to conceive of, prepare for, and shore up any weaknesses in your IT infrastructure. If you prepare for the worst, it may never come.

ISP. According to the Uptime Institute, between 2016 and 2018, 27% of all data center outages were network-related. As more and more systems migrate to the cloud and more and more services become network-dependent, redundant network solutions are increasingly important.
In some cases, that could mean two or more providers, or two or more kinds of service: fiber, cable, and wireless, for example.

Software. Whether it's unintended consequences (Y2K) or a straight-up engineering faceplant (in 1999, NASA lost the Mars Climate Orbiter because a subcontractor used imperial units instead of the metric units the specification called for), software is vulnerable. When software goes bad, there's usually a human to blame, and that's true for cyber attacks, too; DDoS attacks and other cyber intrusions are on the rise. According to IndustryWeek, in 2018 there was "…a 350% increase in ransomware attacks, a 250% increase in spoofing or business email compromise (BEC) attacks and a 70% increase in spear-phishing attacks in companies overall." What does this mean for you? It means defensive redundancies: threat detection, firewalls, encryption, etc. It also means having a robust HA infrastructure in case you do come under attack. With an HA system with automatic failover, you can quickly take down the affected systems and bring up clean ones.

Data. In 2015, a Google data center in Belgium was struck by lightning multiple times in quick succession. While most of the servers were unaffected, some users lost data. Data redundancy is the cornerstone of any HA infrastructure, and new and improved options for data retention are constantly emerging. With the increase in virtual networks, virtual machines, and cloud computing, your company needs to consider both physical and virtual solutions (redundant physical servers, redundant virtual servers) in addition to multiple geographical locations.

How the Right Protection Saves You

As has been mentioned, it's up to you and your company to examine and identify single points of failure, and other weak spots, in your infrastructure. A firm grasp of where vulnerabilities most often occur (housing, hardware, ISP, software, and data) will give you a better understanding of your own system's limitations, flaws, and gaps.
While you can't prepare for (or predict) everything, you can eliminate single points of failure and shore up your IT environment. With an HA system with plenty of redundancies, no single points of failure, and automatic failover, you'll not only safeguard your revenue stream, you'll maintain productivity and inter-office operations, keep staff on other tasks, and sleep better at night (you know, from less anxiety about everything coming to a grinding halt).

What We Offer at Liquid Web

At Liquid Web, we worry about catastrophic failures (preventing them, primarily, but recovering from them too) so you don't have to. To this end, we make automatic failovers, along with cluster monitoring for the shortest and most seamless transitions, a top priority. Heartbeat, our multi-node monitor and the industry standard, keeps a close eye on the health of your systems, automatically performing failovers when needed. Heartbeat can quickly and accurately identify critical failures and seamlessly transition to an elected secondary node.

The automatic failover system in place at Liquid Web is one of many components that comprise our HA infrastructure and uptime guarantee. We offer 1,000% compensation as outlined in our SLA's 100% uptime guarantee. What does this mean? If you experience downtime, we will credit you at 10x the amount of time you were down. At Liquid Web, we also continue to operate at 99.999% uptime (five 9s), a gold standard for the industry; this equates to only 5.26 minutes of downtime a year, 25.9 seconds of downtime per month, and 6.05 seconds of downtime a week. Five 9s is incredibly efficient, and we are proud to operate in that range. However, we are constantly striving for more efficiency, more uptime, and further optimization.

A Final Reminder: Failures Do Happen

Failures do happen. If Google is susceptible to a catastrophic failure, everyone is susceptible to a catastrophic failure.
You can, however, mitigate the frequency and severity of catastrophic failures with a thorough accounting of your infrastructure, a shoring up of your systems, a solid and sensible recovery plan, and plenty of redundancies. Oh, and don’t forget an automatic failover system; it will save you time (and data) when you have to transition from a failing primary node to a healthy secondary node. The post What Happens When There’s a Catastrophic Failure in Your Infrastructure appeared first on Liquid Web.
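To make the failover sequence described in this post concrete, here is a minimal sketch of a heartbeat-driven failover monitor. It is an illustration only: the node names, the timeout value, and the log format are assumptions of this sketch, not how Liquid Web's Heartbeat actually works.

```python
import time

class FailoverMonitor:
    """Heartbeat-driven failover sketch (hypothetical, not a real HA product)."""

    def __init__(self, primary, secondaries, timeout=3.0):
        self.primary = primary
        self.cluster = [primary] + list(secondaries)   # election order: first secondary wins
        self.timeout = timeout                         # seconds of silence before failover
        self.last_beat = {node: time.monotonic() for node in self.cluster}
        self.error_log = []

    def heartbeat(self, node):
        """Record a heartbeat signal from a node (normally sent every second or so)."""
        self.last_beat[node] = time.monotonic()

    def check(self, now=None):
        """Fail over if the primary's heartbeat link has been silent too long.

        Returns the newly elected primary, or None if the primary is healthy.
        """
        now = time.monotonic() if now is None else now
        if now - self.last_beat[self.primary] <= self.timeout:
            return None                                # primary is healthy
        failed = self.primary
        self.error_log.append(                         # error log: why was failover initiated?
            f"failover: no heartbeat from {failed} for {self.timeout}s")
        self.cluster.remove(failed)                    # remove failed primary from the cluster
        self.primary = self.cluster[0]                 # elected secondary takes the primary role
        return self.primary
```

Calling `check()` on a schedule mirrors the four steps above: the severed heartbeat is detected, an error log entry is written, a secondary is promoted, and the failed primary leaves the cluster.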

Eric Enge’s interview with Fabrice Canel

Stone Temple Consulting Blog -

Fabrice Canel is a Principal Program Manager at Bing, Microsoft, where he is responsible for web crawling and indexing. Today's post is the transcript of an interview in which I spoke with Fabrice. Over the 60 minutes we spent together, we covered a lot of topics. During our conversation, Fabrice shared how he and his team think about the value of APIs, crawling, selection, quality, relevancy, visual search, and the important role the SEO continues to play.

Eric: What's behind this idea of letting people submit 10,000 URLs a day to Bing?

Fabrice: The thought process is that as our customers expect to find the latest content published online, we try to get this content indexed seconds after it is published. Getting content indexed fast is particularly important for content like news. To achieve freshness, relying only on discovering new content by crawling existing web pages, sitemaps, and RSS feeds does not always work. Many sitemaps are updated only once a day, and RSS feeds may not provide full visibility into all changes made on a site. So instead of crawling and crawling again to see if content changed, the Bing Webmaster API allows sites to programmatically notify us of the latest URLs they publish. We see this need not only for large websites but for small and medium websites, which don't have to wait for us to crawl them and don't want too many visits from our crawler, Bingbot.

Eric: It's a bit like they're pushing sitemaps your way. And the code to do this is really very simple; you can use any of several protocols to integrate the Submit URL API into your system.

Fabrice: Yes, we encourage both: pushing the latest URLs to Bing, and having sitemaps to ensure we are aware of all relevant URLs on the site.
Pushing is great, but the internet is not 100% reliable: sites go down, and your publishing system or our system may have temporary issues. Sitemaps are the guarantee that we are aware of all relevant URLs on your site. In general, we aim to fetch sitemaps at least once a day; we can fetch more often, but most sites don't want us fetching as often as every second. Complementary to freshness, RSS feeds are still a good solution for small and medium sites, but some sites are really big, and one RSS feed can't handle more than about 2,500 URLs if it is to stay within 1 MB. All of these things are complementary ways to tell us about site changes.

Eric: This means you will get lots of pages pushed your way that you might not have gotten to during crawling, so it should not only enable you to get more real-time content, but you'll also be able to see some sites more deeply.

Fabrice: Absolutely. Every day we discover more than 100 billion URLs that we have never seen before. What is even scarier, these are URLs after normalization (no session IDs, parameters, etc.). This is only for content that really matters, and it's still 100 billion new ones a day. A large percentage of these URLs are not worth indexing; simple examples include date archives within blogs, or pages largely lacking in unique content of value. The Bing mechanism for submitting URLs is in many cases more useful and trustworthy than what Bingbot can discover through links.

Eric: For sites that are very large, I heard you make reference that you would allow them to form more direct relationships to submit more than 10,000 URLs per day.

Fabrice: You can contact us, and we'll review and discuss it based on the business criteria of the sites. Please don't send us useless URLs, such as duplicate content or duplicate URLs, so we won't send fetchers to fetch them.

Eric: How will this change SEO? Will crawling still be important?
Fabrice: It's still important to ensure that search engines can discover your content and links to that content. With URL submission you may have solved the problem of discovery, but understanding interlinking still matters for context. Related to that is selection: the true size of the Internet is infinite, so no search engine can index all of it. Some websites are really big; instead of adding URLs to your site only to get a few of them indexed, it's preferable to focus on ensuring the head and body of your URLs are indexed. Develop an audience, and develop authority for your site, to increase your chances of having your URLs selected. URL submission helps with discovery, but SEOs still need to pay attention to the factors that impact selection, fetching, and content. Ultimately, your pages need to matter on the Internet.

Eric: So even for the discovery part, there is still a role for the SEO to play, even though the API makes it easier to manage on your end.

Fabrice: Yes, for the discovery part there's a role for the SEO to remove the noise and guide us to the latest content. LESS IS MORE. The basics of content structure still matter too. For example, you:

still need titles/headers/content
still need depth and breadth of content
still need readable pages
still need to be concerned about site architecture and internal linking

Eric: On the AI side of things, one of the things I think we're seeing is an increasing push toward proactively delivering what people want before they specifically request it: less about search, and more about knowing the preferences and needs of users and serving things up to them in real time, even before they think to do a search. Can you discuss that a little bit?
Fabrice: You might think of this as position "-1." This is not only about providing results, but about providing content that may satisfy people's needs (information related to you and your interests) within the Bing app or the Bing home page. You can set your own interests via the Bing settings, and then you will see the latest content on your interests across various canvases. I am deeply interested in knowing the latest news on quantum computing… what are your interests? Instead of searching for the latest every five minutes, it's preferable to be notified about what's happening in more proactive ways.

Eric: So Bing, or Cortana, becomes a destination in and of itself, and rather than searching you're getting proactive delivery of content, which changes the use case.

Fabrice: Yes. We prefer surfacing the content people are searching for based on their personal interests. To be the provider of that content, and to have a chance to be picked up by search engines, you have to create the right content and establish the skill and authority of that content. You must do the right things SEO-wise and amplify the authority of your site above other sites.

Eric: There's always the issue of authority: you can make great content, but if people aren't sharing or linking to your content, it probably has little value.

Fabrice: Yes, these things still matter. How your content is perceived on the web is a signal that helps us establish the value of that content.

Eric: Let's switch the topic to visual search and discuss use cases for visual search.

Fabrice: I use it a lot, and shopping is a beautiful example of visual search in action. For example, take a picture of your chair with your mobile device, upload the image to the Bing app, and bingo: you have chairs that match this model. The image is of a chair, it's black, and the app will find similar things that match.
Visual search involves everything related to shopping, day-to-day object recognition, people recognition, and extracting information that matches what your camera is capturing.

Eric: For example, I want to know what kind of tree that is…

Fabrice: Trees, flowers, everything.

Eric: How much of this kind of visual search do you anticipate happening? I'd guess it's currently small.

Fabrice: Well, yes and no. We already use this technology in Bing for search and image search, understanding the images we view on the Internet. For images with no caption or no alt text, if we are able to recognize the shapes in the image, the image may carry additional meaning, and extracting that information can advance the relevance of a web page. Going beyond Bing and search, this capability is offered in Azure and articulated in all kinds of systems across the industry, offering enterprises the ability to recognize images, camera inputs, and more. This can also extend into movies.

Eric: You mentioned the role images can play in further establishing the relevance of a web page. Can visual elements play a role in assessing a page's quality as well?

Fabrice: Yes. For example, you can have a page on the Internet with text content, and within it you may have an image that is offensive in different ways. The text content is totally okay, but the image is offensive for whatever reason. We must detect that and treat it appropriately.

Eric: I'd imagine there are scenarios where the presence of an image is a positive quality identifier; people like content with images, after all.

Fabrice: Yes, images can make consuming the content of a page more enjoyable. I think in the end it's all about the SEO: you need to have good text, good schema, and good images. Users will be happy to come back to your site if it's not full of ads and not just walls of text with nothing to illustrate them.
If you have a bad website with junky HTML, people may not come back. They may prefer another site with better content.

Eric: Integration of searching across office networks is one of the more intriguing things we've heard from Bing, including the integration with Microsoft Office documents. As a result, you can search Office files and other types of content on corporate networks.

Fabrice: When you search with Bing and you are signed in to a Microsoft/Office 365 offering that enables Bing for business, Bing will also search your company data (people, documents, sites, and locations) as well as public web results, and surface them in a unified search results experience alongside internet links. People don't have to search in two or three places to find stuff. Bing offers a one-click experience where you can search your intranet, your enterprise SharePoint sites, and the Internet all at once. You can have an internal memo come up in a search alongside other information that we find online. We offer you a global view. As an employee, this is tremendously helpful for getting more done by making information easier to find. Need to find the latest vacation policy for your company? We can help you find it. Need to know where someone is sitting in your office? We can help you find that too. And informational searches can seamlessly find documents both online and offline.

Eric: Back to the machine learning topic for a moment: are we at the point today where the algorithm is obscure enough that it is not possible for a single human to describe the specifics of the ranking factors?

Fabrice: In 15 minutes, it can't be effectively done. We are guided by the decisions we make in terms of quality expectations and determining good results versus not-so-good results. Machine learning is far more complicated, but when we have issues, we can break it down and find out what is happening per search.
But it's not made up of simple "if-then" coding structures; it's far more complicated.

Eric: People get confused when they hear about AI and machine learning and think that it will fundamentally change everything in search. But the reality is that search engines will still want quality content, and will still need to determine its relevance and quality. Machine learning may be better at this, but as publishers, our goal is still to create content that is very high quality and relevant, and to promote that content to give it high visibility. That really doesn't change; it doesn't matter whether you're using AI and machine learning or a human-generated algorithm.

Fabrice: That will never change. SEO is like accessibility, where you need common rules to make things accessible for people with disabilities. In the process of implementing SEO, you're helping search engines understand your content. You need to follow the basic rules; you can't expect search engines to do magic and adapt to each and every complex case.

Eric: There's an idea that people have that machine learning might bring in whole new ranking factors that have never been seen before. But it's not really going to change things that much, is it?

Fabrice: Yes, a good article is still a good article.

Eric: A couple of quick questions to finish. John Mueller of Google tweeted recently that they don't use rel=prev/next anymore. Does Bing use it?

Fabrice: We are looking at it for links and discovery, and we use it for clustering, but it is a loose signal. One thing related to AI: at Bing we look at everything. This isn't a simple "if-then" thing; everything on the page is a hint of some sort. Our code looks at each and every character on each and every page. The only things that aren't hints are robots.txt and meta noindex (which are directives); everything else is a hint.

About Fabrice Canel

Fabrice is a 20-year search veteran at Bing, Microsoft.
Fabrice is a Principal Program Manager leading the crawling, processing, and indexing team at Bing, dealing with the hundreds of billions of new or updated web pages every day. In 2006, Fabrice joined the MSN Search beta project, and since that day he has been driving the evolution of the Bing platform to ensure the Bing index is fresh and comprehensive; he is also responsible for the protocols and standards for Sitemaps.org and AMP on Bing. Prior to MSN Search, Fabrice was the Lead Program Manager for search across Microsoft web sites, in a role covering all aspects of search, from search engine technology to the search user experience to content, in the very early days of SEO.
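For readers who want to try the Submit URL API Fabrice describes, a batch submission looks roughly like the sketch below. The endpoint and field names follow Bing's published SubmitUrlBatch JSON interface, but treat them, along with the placeholder API key and URLs, as assumptions to verify against the current Bing Webmaster documentation.

```python
import json
from urllib import request

# Endpoint and field names follow Bing's documented SubmitUrlBatch JSON API;
# verify against the current Bing Webmaster docs before relying on them.
ENDPOINT = "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlBatch"

def build_submission(site_url, urls, api_key):
    """Return the request URL and JSON body for one batch submission."""
    body = json.dumps({"siteUrl": site_url, "urlList": list(urls)})
    return f"{ENDPOINT}?apikey={api_key}", body

def submit(site_url, urls, api_key):
    """POST a batch of freshly published URLs to Bing (needs a real API key)."""
    url, body = build_submission(site_url, urls, api_key)
    req = request.Request(url, data=body.encode("utf-8"),
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)
```

A publishing pipeline would call `submit(...)` right after pushing new content, staying under the 10,000-URL daily quota mentioned in the interview.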

Local Development in WordPress [Webinar]

WP Engine -

A local development environment enables you to build and test your web development projects right on your computer, which gives you the freedom to focus on coding first without the fear of a problematic deployment. Local dev environments are ideal if you’ve been tasked with building a website or app for someone else or if… The post Local Development in WordPress [Webinar] appeared first on WP Engine.

Com vs Net: Which Should You Choose?

The Domain.com Blog -

There are two basic components within a website address. First, there's the domain name; it's what connects the website to a company or individual. It usually contains the name of the business, speaks to what the business offers, or both. Then, there's the domain name extension; it identifies what kind of website it is. There are over a thousand domain extensions, although these are the most common:

.com
.net
.org
.edu
.gov

The two most frequently used domain extensions (.com and .net) are used by individuals and businesses who are trying to expand their reach online. Having a website allows you to buy and sell products online, offer research into a specific topic, and spread a captivating message. So with both .com and .net being so common, which domain extension should you use?

It all starts with a great domain. Get yours at Domain.com.

3 Factors to Consider When Choosing the Right Domain Extension

Whether you're a for-profit business, a blogger, or a conspiracy-theory debunker, the right domain extension sets the proper expectation for users accessing your site. Imagine trying to purchase shoes online and seeing that the domain extension is a .org. One might make the logical leap that purchasing these shoes in some way benefits a nonprofit (as most nonprofits and charities use the .org extension). While at first, this sounds great — even more reason to buy those shoes! — some might consider that a dishonest use of a domain extension. (Not that there are many requirements as to which TLD, or top-level domain, businesses can use, but there are certain expectations and connotations for each one.) To properly utilize the .com or .net domain extension, consider these three factors.

What Is the Purpose of the Website?

Are you selling a product? Are you offering information? Are you trying to save a species of animals? These are important questions because they strike at the heart of your business and determine which domain extension is appropriate.
Here is a breakdown of the most common domain extensions:

.com – Usually offers a product or service. "Com" is short for "commercial." Commercial businesses, for-profit companies, personal blogs, and non-personal blogs are all standard candidates for a .com domain. That said, because of its generality, almost any website is acceptable as a .com.

.net – Stands for "network," and is generally associated with "umbrella" sites — sites that are home to a wide range of smaller websites. Network sites were initially created for services like internet providers, email services, and internet infrastructure. If a business's desired .com domain name is taken, .net can be considered an alternative.

Other commonly used domain extensions have a more specific purpose:

.org – Short for "organization." These sites are generally associated with nonprofits, charities, and other informational organizations that are trying to drive traffic for noncommercial purposes. Other organizations that use .org are sports teams, religious groups, and community organizations.

.edu – "Education." Schools, universities, and other educational sites use the .edu domain extension for an air of authority in the education space.

.gov – "Government." These sites are required to be part of the U.S. government. Anything related to U.S. government programs or other departments must have a .gov domain extension.

How Common Is Your Business Name?

Imagine: a business offers standard products like sewing equipment and materials, and the name of the company is something equally familiar, like Incredible Sewing. Because "incredible" and "sewing" are two commonly used words, the chance that the desired domain is available with a .com extension is much lower than with .net. (Although, as of this writing, Incredible Sewing is available in the domain space.) The reason for this is how frequently each domain extension is used.
In 2018, upwards of 46% of all registered domains used the .com TLD, while only 3.7% used .net. When trying to come up with the perfect web address, sometimes it feels like every one-word or two-word .com domain name is already taken. This is one reason why some individuals and businesses choose a .net extension over a .com. (Note: it might be beneficial to check whether your desired domain is available before moving forward with a project or company. Planning ahead will save time and prevent you from having to remake those business cards due to an unavailable domain name! If you're wondering how to search for your domain, check out Domain.com.)

Memorability: Com vs Net

Has this ever happened to you? An advertisement is playing, and you barely catch the tail end of it. You type in the website address only to have it come up blank. Later, you find out you had put in .com when it was a .net, or some other, domain. The fact is, the basic assumption about websites is that they all have a .com domain extension; after all, the second most common top-level domain (.org) is used only about 5% of the time. By going with the tried and true .com, companies can avoid this confusion and not worry about decreased traffic. If this seems absurd, consider: most cell phone keyboards now come with a ".com" button, while none come with a .net, .org, or any other domain extension attached.

Other Considerations for Creating a Web Address

While both .com and .net are useful domains, there are other considerations to think about when creating a web address. Some of those center around:

Traditional vs nontraditional domains
Domain protection
SEO: how each performs

Traditional vs Nontraditional Domains

For most businesses, straddling the traditional and the nontraditional is part of the balancing act. While companies want to seem edgy and unique, unconventional choices can be viewed negatively by more traditional businesses and customers.
In the web domain space, there are now over a thousand domain extensions available to the consumer. All but a handful are looked at as "nontraditional." So, while it might seem valuable to stand out, be sure to consider how it may be viewed professionally.

New TLDs

Back in 2012, ICANN decided to allow businesses to apply for unique domain extensions. This quickly raised the number of TLDs from its original 22. Some of the early applications for domain extensions involved words such as:

.design
.lol
.love
.book
.tech

Some of these new TLDs offered immediate value to businesses and consumers who wanted a new and noteworthy domain. Others seemed more like gag websites (hence the stereotype of new TLDs being unprofessional). Either way, these new TLDs have exploded into a comprehensive list. Now, if you're a yoga company, you can use .yoga. Sell yachts? Make tech? Play tennis? Eat soy? These are all available as domain extensions. This means not only can you create more unique web addresses, but you can also be more specific. If having a new TLD sounds perfect for your business, be sure to check through the full list to find one that fits your needs.

Domain Protection

Depending on what you want to accomplish with your business website, it might be worth registering both the .com and the .net. In this way, you can protect yourself from competing companies taking a very similar domain. Otherwise, another company can ride on your success and potentially drive traffic away. As companies grow, they become more susceptible to these sorts of schemes. They are then forced to decide whether to buy out the competing website or let it be. Needless to say, the larger the company, the more they're going to have to pay. What else should you look out for when it comes to people using similar domain names?

Typosquatting

Typosquatting is when individuals purchase web domains based on common misspellings of words.
From our earlier example of Incredible Sewing, a typosquatter might take the web domain spelling "incredible" as "incredibel." By systematically using misspellings, these forms of leeching can drive substantial traffic away from the intended website. Typosquatters can then offer to be bought out, or they'll just continue to steer traffic to other organizations that they own. As of right now, the most viable option for protecting yourself is to purchase multiple domains, although this is becoming more difficult with each new TLD.

SEO: How Each Performs

Search engine optimization has to do with complex algorithms that determine how relevant your website is to a given search. In terms of which domain extension you should pick (com vs net), there is no evidence that one performs better than the other. It can be noted, however, that having certain keywords within your web domain can improve your SEO ranking. Having "sewing" within your domain will make your site more relevant for keyword searches around sewing. It's that straightforward.

Com: Pros and Cons

As an overview, let's run through the benefits and pitfalls of using a .com domain extension:

Pros – Using a "commercial" extension, companies and individuals can signal their intention. Whether that's to sell a product or service or to promote your work, .com does this in a manner that's professional and can be trusted. Also, there's no worrying over your web address being confused.

Cons – Because nearly half of all websites are based on .com, finding the perfect domain name that isn't already in use can be tough. It can be pricey to buy out an existing domain and time-consuming to find one available.

Net: Pros and Cons

Originally designed for network organizations like internet providers and email sites, .net sites have been rising in popularity as an alternative to .com:

Pros – Many fewer .net domains have been registered than .com domains.
This means there's a higher chance of getting your ideal web domain. Also, because of its original design, .net sites are often associated with having a community around them, which can promote a positive image.

Cons – These websites will need to market harder to compete with a similar .com site. People automatically assume that any website is a .com site, which means businesses can lose traffic due to confusion.

It all starts with a great domain. Get yours at Domain.com.

How to Create the Perfect Domain Name

Once you've decided whether you're going with a .com or a .net domain extension, it's important to make sure it's paired with the perfect domain name. The ideal address will do one of three things:

State your business
State what your business does
Incite intrigue

The first two are preferred, while the third is more of a backup strategy. Because many .com and .net names have already been taken, sometimes a roundabout domain will be the best solution. A domain name should also have a few decisive characteristics. Try creating a web address that is some combination of:

Clear
Concise
Unmistakable
Short

Straightforward Approach

The first step is always to check whether the business name is available as a domain. If your business name has been taken, check to see how up-to-date the website is. If it's not current and doesn't look like it's being used, it might be possible to purchase the domain name from whoever owns it. Having the business name as the domain name is ideal because it's the logical extension of that business. Starbucks has starbucks.com. Apple has apple.com. If the business name is unavailable, sometimes it helps to add a modifier word. If starbucks.com were already taken, the next logical domain would be starbuckscoffee.com. In the same way, Apple would be able to use appleelectronics.com. It's not as short as the business name alone, but it is still clear, concise, and unmistakable.
Branding a Unique Term Another idea for getting the perfect website domain is to coin a term that’s unique to your business. Then you can use that term within your brand’s website. By doing this, you not only have crafted a unique web identity, but it can also be concise and short. Conclusion When determining which domain extension is better, com vs net, always be sure to look inward first. Acknowledge the purpose of putting your content online. Whether it’s to market a brand, sell a product, or connect various smaller sites by theme, each domain extension has its proper setting. By crafting the perfect domain name with the suitable domain extension, you can have a web address that is memorable, unique, and fitting for your business. More Information To find out more about the differences between new TLDs and gTLDs, check out our domain blog today! There you’ll find other resources like How to Block an IP Address, How to Design a Website, and more. The post Com vs Net: Which Should You Choose? appeared first on Domain.com | Blog.
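The typosquatting tactic described earlier can also be probed programmatically when deciding which defensive registrations are worth buying. Here is a minimal sketch in Python; the function name and the single-transposition strategy are illustrative assumptions, not a complete typo model:

```python
def transposition_typos(label):
    """Generate adjacent-character swap typos of a domain label,
    one common pattern that typosquatters register."""
    swaps = {label[:i] + label[i + 1] + label[i] + label[i + 2:]
             for i in range(len(label) - 1)}
    return swaps - {label}  # drop no-op swaps of repeated letters

# "incredibel" (the example above) is one adjacent swap away:
print("incredibel" in transposition_typos("incredible"))  # True
```

Real typo models also cover omitted letters, doubled letters, and adjacent-key substitutions, so treat this as a starting point rather than a full list of domains to register.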

What Are Live-State Snapshots with My VPS Plan

InMotion Hosting Blog -

If you have VPS hosting, then you probably already know how important the safety and backup features are for keeping your website secure. One of the more recent additions is the ability to take a snapshot of your VPS. Many different companies offer their own snapshot products, but they essentially offer the same features: the software will take a single backup of the VPS and give you the option to restore back to the original save point at any time. Continue reading What Are Live-State Snapshots with My VPS Plan at The Official InMotion Hosting Blog.

Meet The British History Podcast: “History, the Way It’s Meant to Be Heard”

DreamHost Blog -

Think you hate history? You’re probably wrong, says Jamie Jeffers, founder of The British History Podcast. The problem, he says, isn’t that history is dry or boring — the problem is that it is taught that way, with rote memorization and little relevance to the modern world. “People are people,” Jeffers says. The stories of history, even ancient history, “are relevant and compelling on their own. They are only made irrelevant by poor storytellers who forget that simple truth — that history is the story of humanity. It’s about all of us.” With his podcast, which has been in production for almost a decade and has cultivated a loyal fan base over hundreds of episodes, Jeffers tells the stories of British history by tapping into that humanity. In his chronological retelling, you won’t hear lists of names, treaties, and battles, but rather tales of the cultural underpinnings behind the actions of kings and the day-to-day lives of the people of Britain. In Jeffers’ words, it was a happy convergence of “transatlantic immigration, global financial collapse, and ancient human traditions” that took him from unemployed lawyer to full-time podcaster creating the ultimate passion project, one that draws on his own personal history, builds his future, and connects us all to the past. Related: Step-by-Step Guide: How to Start a Podcast With WordPress History Through Storytelling It’s all his grandfather’s fault, Jeffers says. Jeffers, who moved to the US from the UK when he was a kid, learned the history of his homeland from his grandfather, who wanted to make sure young Jeffers heard stories of his ancestors alongside his American education. “He took it upon himself to teach me what he knew about British History as I was growing up,” Jeffers says. “He was an amazing storyteller, and so my first experience with history was through hearing about amazing events and figures.
It was learning history as people traditionally taught it, as an oral history.” His grandfather’s storytelling taught Jeffers to love history — at least until he actually studied the subject in school. “I went to high school, and history was suddenly reduced to memorizing dates and names for a test,” Jeffers says. “No context, no nuance, no wonder at our shared past. It was such a disappointing experience that I lost interest in the study of history.” Eventually, Jeffers went on to study English in college and then become a lawyer. For the most part, Jeffers tabled his interest in history — that is, until the recession forced it back into his life. Global Financial Collapse The 2008 financial crisis wasn’t kind to most people — Jeffers included. As money got tight, he looked around for cheap sources of entertainment, leading him straight into the world of podcasting. “The first show I found was The Memory Palace, which is still going, and it became a regular companion when I was at the gym or taking my dog, Kerouac, for a walk. The host, Nate DiMeo, couldn’t have known it, but the way he talked about little odd stories from history made me feel like I was reconnecting with part of my childhood.” But a search for podcasts about British history was disappointing, to say the least. It brought him to a “show that was done by a guy who seemed to be reading random entries off Wikipedia. Incorrect entries, for that matter.” Back in those pre-Serial days, podcasting was a new thing — it was “pretty punk rock,” he says. “Few people knew about it, and even fewer people did it, which meant that many topics weren’t being covered and those that were weren’t being covered well. Quality was definitely a problem.” Jeffers did find a history show or two but occasionally found himself wishing for a good podcast that took on a chronological history of Britain. Then one day the financial collapse hit closer to home, and Jeffers lost his job as an attorney. 
“The part that people rarely talk about with unemployment is how boring it is,” he says. “So I decided that any time that I wasn’t job searching would go towards making that show I always wanted.” The podcast launched with its first season in May 2011, beginning with the Ice Age and prehistoric Britannia and moving into the Roman conquest of Britain. At first, Jeffers’ vision was nothing more than a fun hobby that only his parents would listen to. “Eight years later, it’s my life’s work,” Jeffers says. “Oh, and my parents still don’t listen to it. But a lot of other people do.” Today, the podcast boasts more than 3,000 reviews on iTunes and shows up on lists such as recommended podcasts for fans of Serial and Parade’s list of top history podcasts. Beyond the Battle Search your favorite podcasting app these days, and you’ll find history shows aplenty. But the British History Podcast (BHP) isn’t your run-of-the-mill history podcast, Jeffers says. “Many history podcasts are dry accounts that only perk up when they can talk about men swinging swords. They skip over the culture of the time, other than as it pertains to kings and generals, and then give you incredibly granular details of men killing other men in battle.” What interests Jeffers (and his audience) are the stories behind the conflicts. To truly understand and care about an action-packed battle, audiences need to appreciate the stakes. “There’s a reason why The Phantom Menace sucked, and it wasn’t the fight choreography,” he says. “Context is king, and that’s where our focus is.” That’s why the BHP discusses at length the political, social, and cultural realities that drive the “action scenes” of history. Another way the show’s different: “We talk about women.
It’s strange how often they’re written out.” Jeffers cites one of his favorite little-known figures from history: Lady Æthelflaed of Mercia, who reigned in an era when women were so overlooked, even vilified, that there weren’t any queens — just women known as “the king’s wife.” “And then you have the noble daughter of Alfred the Great, a woman named Æthelflaed, who ruled Mercia on her own after her husband died. She led armies. She fought off a massive force of Vikings at Chester by throwing everything, up to and including the town’s beehives, at them. This woman was so influential that after she died, even though the culture was deeply misogynistic, the Mercians chose to follow her daughter.” Jeffers’ favorite era of British history is the Middle Ages — “which I’m sure most of our listeners already know since we’ve spent about seven years in them so far.” The BHP is currently detailing the reign of King Æthelred Unræd (aka King Ethelred the Unready), who is often blamed for the downfall of the Anglo-Saxons — “though I think there’s plenty of blame to go around.” Jeffers is most looking forward to covering the 15th-century Wars of the Roses, a series of English civil wars: “the diaries we have out of that era are stunning and show the real human toll that this conflict was taking on the population.” The planned finish line is the dawn of WWII, which could take another decade to reach. Until then, Jeffers is dedicated to dissecting and retelling as many stories and cultural tidbits as he finds relevant — a quest that fits nicely in the podcasting sphere. “Can you imagine The History Channel allowing me to do over 300 episodes of British History and spend literally hours just talking about how food was handled in the Middle Ages?
Part of what makes podcasting so amazing is that it allows for niche shows like the BHP to exist.” Behind the Scenes Jeffers is quick to remind his audience that he isn’t a professional historian, though his “magpie approach to education” serves him well as a “history communicator.” “My educational background has a common throughline of narrative building and research,” Jeffers says. “I studied storytelling in college, getting a degree in creative writing while also spending a lot of time taking courses in subjects like critical and cultural theory. As for law, my focus was as a litigator. What many people don’t realize about litigators is that a lot of what you do is tell stories to the judge or jury. You do a lot of deep research and then turn it into an easy-to-digest narrative for why your side should win. Turns out that these skills serve very well for teaching history — especially little-known history.” Each 25- to 40-minute episode takes about 40 to 50 hours to produce. As for structuring the stories, Jeffers rarely finds a clear “pop history narrative” to build around because the history of medieval Britain he aims to create simply doesn’t exist elsewhere. Instead, he digs through secondary sources, fact-checks primary sources, scans and fact-checks scholarly articles for alternative theories, and then looks into “any rabbit holes that pop up during the research.” The lengthy editing process is a collaboration between Jeffers and his partner and co-producer Meagan Zurn — or Zee, as she prefers. “Then I finally record the episode, do sound editing, and launch. It’s quite a process.” There’s no way Jeffers could juggle a full-time job with all of the research and planning involved. But thanks to a dedicated community of listeners, the podcast moved from passion project to day job.
He doesn’t even need to run ads — it’s funded entirely through donations and a membership, which grants paying listeners access to exclusive content. “I’ve really lucked out in the community that has developed around the podcast,” Jeffers says. In fact, he says his favorite part of producing the podcast is connecting and collaborating with the community. “They’re really supportive and enthusiastic people.” The British History Podcast official web page, complete with membership content and a full archive of eight years’ worth of podcasting, is proudly hosted by DreamHost. Like the podcast itself, the website has been a DIY project: “When you’re a small project like this, anything you can do yourself, you do.” The site uses DreamPress Pro with Cloudflare Plus, “which has allowed us to have a rather stable user experience even during high load times like on launch days,” Jeffers says. “The tech support team has been really helpful in finding solutions to some of the more thorny problems of running a podcast site with a membership component.” A Romance for the Ages Jeffers says he’s met some incredible people through the podcast community, including his producer — and now wife — Zee. In addition to co-producing the BHP, Jamie and Zee are partnering up for a new venture: parenting. Back in the early days of the BHP, Jeffers used an “old clunky Frankenstein computer that kept breaking down. I had a hard drive crash, a power supply short, a motherboard fry. I swear that damn computer had gremlins, and as a result, I repeatedly had to go on our community page and apologize for episodes getting delayed.” The community ganged up and insisted his problems stemmed from using a PC — all except one person, who stood her ground against the Mac fans.
“I believe her exact phrase was, ‘You’re all caught up in a marketing gimmick,’” Jeffers says. A few months later, when he had an idea for a side project and wanted honest feedback, he remembered this listener’s well-researched, uncompromising arguments. “And half a world away, in Southern England, Zee got a message out of the blue,” Jeffers says. “It ended up being the smartest thing I’ve ever done. The person I reached out to was a Ph.D. candidate in sociology and media with a background in anthropology and archaeology. She understood on an intrinsic level the ethics of the show, the long-term strategy, the purpose of it, and what it could be going forward.” And just like that, Jeffers had a collaborator: “One day, I was doing the show entirely on my own; the next day I was running all my ideas by her, and I structured my life so that I could work with her.” They discussed the show daily; Zee reviewed Jeffers’ scripts and prompted heated debates over the content. “And through that, the show dramatically improved in tone and style. She also became my best friend. Truth be told, I think she was my best friend from the first time we talked.” “Much later, we met in person, and it was clear my ferociously intelligent best friend was also really attractive. Eventually, we started dating. Then she proposed to me one Christmas morning, and now we’re expecting our son this July.” By the way, Jeffers still uses a PC.
“We have a whole new culture to talk about, along with larger-than-life characters to introduce. The story is about to get a whole lot bigger.” What’s your next great idea? Tell the world (wide web) about it with DreamHost’s Managed WordPress Hosting, built to bring your dream to life without breaking the bank — or making any compromises in quality. The post Meet The British History Podcast: “History, the Way It’s Meant to Be Heard” appeared first on Website Guides, Tips and Knowledge.

What Is ASP.NET Hosting?

HostGator Blog -

The post What Is ASP.NET Hosting? appeared first on HostGator Blog. One of the most important decisions every website owner must make is choosing the right type of web hosting services. And there are a lot of different types of hosting plans out there. Selecting the best web hosting solutions for your website depends on a number of different factors, including the programs you use to build and maintain your website. For a certain subset of website owners, that makes considering ASP.NET web hosting services an important part of the process of finding the best plan for you. Before we can provide a good explanation of what ASP.NET web hosting is and who it’s right for, we need to define what ASP.NET is. What Is ASP.NET? ASP.NET is an open-source framework programmers can use to build dynamic websites, apps, games, and online services with the .NET platform. In ASP.NET, programmers build web forms that become the building blocks of the larger website or app they work to create. While ASP.NET is not as commonly used as PHP—the most ubiquitous of the programming languages used to build websites—it provides some distinct benefits for web designers that make it a strong choice for many websites. 10 Pros of Using ASP.NET ASP.NET isn’t for everybody, which is why it has a much smaller market share than PHP. But the pros of using ASP.NET to build your website or app are notable enough to make it well worth consideration. Here are ten top reasons to consider using ASP.NET. 1. It’s open source. As an open-source framework, any developer or programmer can make changes to the ASP.NET architecture to make it work the way they need. And often developers will share any updates or improvements they make with the larger community, so you can benefit from the work being done by a large number of talented, skilled ASP.NET programmers. Any open-source piece of software or program gets the benefit of all the great minds that use it.
Every programmer that sees a way to make it more flexible, secure, or feature-rich can contribute to it. With over 60,000 active contributors, you can count on ASP.NET to just keep getting better. 2. It’s known for being high speed. ASP.NET makes it easier to build a site while using less code than other programming options. With less code to process, websites and apps load faster and more efficiently. ASP.NET also uses compiled code rather than interpreted code. Compiled code is translated into object code once and then executed; every request after that runs the already-compiled code, so it loads faster. In contrast, interpreted code has to be read and interpreted every time a user accesses it, which slows things down. While you always have options for speeding up your website, no matter what you build it with, ASP.NET means you’re starting off with a website that will work and load that much faster than with other options you could choose. 3. It’s low cost. In addition to being open source, ASP.NET is also free. You can download the latest version of the software from the website for nothing. You can write ASP.NET code in any simple text editor, including free options like Microsoft’s Visual Studio Community edition. In some cases, as with Visual Studio, the most useful editors have a free basic plan you can use to start and paid versions that provide more useful features for the common needs of big businesses, such as collaboration options. You may end up spending some money to get the full use of it you need, but businesses on a budget have the option of using ASP.NET for free. 4. It’s relatively easy to use. While PHP has a reputation for being easier to use, ASP.NET also has many features that make it intuitive for programmers or reduce the amount of work required to create a website or app. For one thing, programming with ASP.NET requires creating less code than most other options.
That means both less time spent writing code for developers and faster page loads, because there’s less code to process. For another, it offers code-behind mode, which separates the design and the code. This creates separate files for the design part of a page and the code part of a page. That makes it easier to test things out and make changes as you go without messing anything up. Finally, ASP.NET allows for template-based page development and server-side caching, both of which mean you can make the design elements you build go further and easily re-use them for different parts of the website or application. While ASP.NET is primarily a resource for professional developers rather than beginners, there is a range of free resources available for those who want to learn the ropes. 5. It has a large developer community. Even though ASP.NET is relatively easy to use, many website owners will want to hire a professional developer to help with the particulars of building out a website or app. Luckily, the ASP.NET community is big enough that finding a skilled developer to hire who has experience in using the framework shouldn’t be a problem in most cases. And having a large community also means that, as open-source software, there are more smart minds working to improve upon ASP.NET on a regular basis. Many of the issues it had in the past have been fixed, and anything about it you don’t like today may well be taken care of in the months or years to come. 6. It requires less setup for Windows users. If your business already uses Windows products, then picking a Windows-friendly framework to build your website or app on will make the overall process easier on your team. Since it’s made by Microsoft, ASP.NET works seamlessly with other Microsoft applications. Getting your various products to play nice together and work in tandem will be simple.
And you won’t have to worry about an update to ASP.NET or any of your other Microsoft applications breaking compatibility. Microsoft makes sure that updated versions of its various products and applications still work well together, even as they all evolve over time. 7. It offers support for multiple languages. Programmers using ASP.NET have a couple of different programming languages they can choose from: C# and VB.NET. C# in particular is a popular option with many developers because it’s powerful, flexible, and easy to learn. It’s one of the most popular programming languages today and is known for being particularly well suited for building Microsoft applications, games, and mobile development. 8. It’s now compatible with all servers. Some articles on ASP.NET list one of the main disadvantages as being that it only works with Windows servers. In fact, several years ago Microsoft released ASP.NET Core, which made the framework compatible with all types of servers—Linux, macOS, and Windows. While it still may work best with a Windows server, since it was initially designed with that compatibility in mind, you can use ASP.NET no matter which type of web server you prefer. 9. It’s supported by Microsoft. Microsoft is one of the biggest and most powerful tech companies in the world. Any product that has its backing can count on regular maintenance, updates, and improvements. With some free products, there’s always the risk that their creators will stop supporting them and anyone dependent on them will have to start from scratch, but ASP.NET has the power of a company that’s not going anywhere behind it. 10. It’s got a great reputation for security. One of the main areas where most experts agree that ASP.NET beats PHP is security. The framework supports multi-factor authentication protocols that allow users to control who has access to the website or app they create with it.
And ASP.NET includes built-in features that protect against common attack techniques like cross-site scripting (XSS), SQL (structured query language) injection attacks, open redirect attacks, and cross-site request forgery (CSRF). Website security is an increasingly important issue for all website owners to consider, especially as hacks and high-profile data breaches become more common. Choosing ASP.NET is one of several steps you can take to make your website more secure. 5 Cons of Using ASP.NET That’s a long list of pros, which may have you wondering why so many people still choose PHP over ASP.NET. It’s not all positives; there are a few downsides to choosing ASP.NET as well. 1. It’s compatible with fewer CMSes than PHP. One of the main reasons that some people prefer PHP is that it works with popular content management systems like WordPress. For people more comfortable using a CMS, which makes creating and updating a website easier if you don’t know how to code, ASP.NET puts a serious limitation in their path. With over a quarter of the entire internet running on WordPress, and content management systems like Drupal and Joomla powering much of the web as well, that makes PHP the natural choice for a majority of websites. 2. It has fewer templates and plugins. Because ASP.NET has fewer users, it also has fewer extras. With fewer people to develop useful features like templates and plugins, there just aren’t as many available to users of ASP.NET. These kinds of extras extend the functionality of a program and can make it easier for people to create the exact kind of website or app they want. While there are still definitely options you can take advantage of with ASP.NET, fewer choices means getting your website where you want it to be will be harder. 3. It’s potentially expensive if you’re not already using Windows. As we already mentioned, using ASP.NET is technically free.
But using it tends to make the most sense for companies that already have access to a number of Microsoft products. One of the big benefits it offers is working seamlessly with all those other Microsoft solutions, so if you need something a Microsoft product offers while working on your website in ASP.NET, you’ll likely have to shell out for an additional product. Not everyone who uses ASP.NET will feel the need to spend money on other Microsoft solutions, but some will. If you end up deciding you need the additional functionality various Microsoft products provide, the cost can quickly add up. 4. It has a smaller community than PHP. While ASP.NET has a community that’s devoted, it’s much smaller than the community that uses PHP. That means fewer support resources and fewer developers working to make the framework better. It also means businesses will find it harder to find professional developers skilled in ASP.NET than in PHP (although far from impossible). And you won’t have as many forums or user groups to turn to with questions. While that is an inconvenience, there is enough of a community out there that you may not feel a lack if you do choose to go with ASP.NET. But if having a supportive community is an important part of your decision when choosing what to build your website or app with, other options beat ASP.NET in this category. 5. It’s harder to learn than PHP. ASP.NET is relatively easy for developers to learn, but it has more of a learning curve than PHP. And because you can’t use intuitive content management systems like WordPress with it, it’s generally out of reach for many beginners who can’t afford to learn programming languages themselves or hire a professional when building out their website. For big businesses with a budget to put toward building a website or app, this is likely to be a non-issue since finding skilled ASP.NET programmers to hire won’t be too hard.
But for smaller businesses and individuals building a more basic website, it’s a good reason to pick a simpler solution. What Is ASP.NET Hosting? Now that we’ve covered the basics of what ASP.NET itself is, we come back around to the main question at hand: what is ASP.NET web hosting? ASP.NET hosting is any web hosting plan designed to be compatible with ASP.NET. In many cases, that means Windows hosting, but since ASP.NET is now compatible with other types of servers, it doesn’t have to mean that. Two main things define ASP.NET hosting services: 1. It promises compatibility with ASP.NET and all associated web applications. ASP.NET hosting solutions must provide seamless compatibility with ASP.NET itself. But you’ll also want to make sure your web hosting plan provides compatibility with other web applications you’re likely to use with ASP.NET, such as the Plesk Control Panel and any other Windows products you use. 2. It has an easy installation option. A good ASP.NET hosting plan will include simple one-click installation that lets you add ASP.NET to your web hosting platform within seconds. You have enough work to do building your website, game, or app—you don’t have time to spend on a complicated installation process. A good ASP.NET hosting option ensures you don’t have to spend any longer on this step than necessary. What to Look for in an ASP.NET Web Hosting Plan If you determine that using ASP.NET is the best option for your website, then an ASP.NET hosting plan is a smart choice. When researching your options, look for a web hosting plan that includes: A 99.9% Uptime Guarantee – Uptime is the amount of time your website is working and accessible to visitors. It’s one of the main differentiating factors between web hosting companies. The best companies promise at least 99.9% uptime and back that claim up with a money-back guarantee. 24/7 Customer Support – The moment you have an issue with your website, you want to get it fixed.
24/7 customer support means you can reach someone right away and get the problem taken care of faster. Plenty of Bandwidth – Look for an ASP.NET hosting provider that offers plans at different levels, especially if your website or app will need a significant amount of bandwidth. If you need it, make sure you can get an enterprise-level plan compatible with ASP.NET. A Reputation for Security – Choosing ASP.NET to build your website is one smart step you can take for security; choosing the right web hosting provider is another. A web hosting provider that uses strong firewalls and offers security features like an SSL certificate will provide an extra level of protection that keeps your website and its visitors safer. HostGator’s ASP.NET web hosting services offer everything on the list. We make it easy to add ASP.NET to your hosting account so you can get started faster. And we have one of the top reputations of any web hosting company in the industry. If you’re still not sure about the right web hosting provider or company for your ASP.NET website, our sales representatives and support team are available 24/7 to answer any questions you have. If you’re looking into a different service like dedicated server hosting, cloud hosting, or shared hosting plans, our experienced team can help you find the best package for your needs. Find the post on the HostGator Blog
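The SQL injection protection mentioned among ASP.NET's security features rests on a language-agnostic idea: user input must be passed to the database as data, never spliced into the query text. Here is a minimal illustration of that idea using Python's sqlite3 module rather than ASP.NET's own APIs; the table and input values are invented for the demo:

```python
import sqlite3

# A throwaway in-memory database with a single user row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

malicious = "nobody' OR '1'='1"

# Unsafe: string formatting lets the input rewrite the query logic.
unsafe = f"SELECT COUNT(*) FROM users WHERE name = '{malicious}'"
print(conn.execute(unsafe).fetchone()[0])  # 1 -- the injection matched every row

# Safe: a parameterized query treats the input as a literal string.
safe = "SELECT COUNT(*) FROM users WHERE name = ?"
print(conn.execute(safe, (malicious,)).fetchone()[0])  # 0 -- no user by that name
```

ASP.NET's data-access APIs apply the same principle through parameterized commands, and its XSS and CSRF defenses follow analogous escape-by-default rules.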

Comparing Shared Hosting vs VPS: Which Is Right For Me?

Liquid Web Official Blog -

There are an almost limitless number of options available for website hosting, especially if you have a number of websites. You can pack all of your sites into a single Shared hosting plan, utilize reseller-style hosting, which allows you multiple Shared hosting accounts, or take full, assisted control of your hosting with a Virtual Private Server (VPS). But when is it better to stay on a Shared Hosting plan compared to upgrading to a small VPS Hosting plan? Here are some key questions to consider: How much disk space does your site need, and how quickly do you expect that to grow? Does your site need a higher memory limit than average, or does it require more processing power? Are there additional server-side applications that you need for back-end processing, like LaTeX, FFmpeg, ImageMagick, or Java? Do you have an application that runs exclusively on Windows, or would a less expensive Linux package work just as well? How much bandwidth do you currently use or expect to use for all of your websites? What kind of content do you have on your sites (online shopping, static content, private information, blog, etc.)? Shared Hosting Vs VPS Hosting: Six Areas To Consider Let’s dive into six core areas to consider as you make your decision: traffic volume, comfort level, controllability, resource availability, scalability, and price. Traffic Volume The amount of bandwidth in and out of your server is one consideration. Inbound bandwidth is usually less important than outbound bandwidth because unless your visitors will be uploading a lot of data, inbound HTTP requests will be small in size compared to the documents and images that your site will return for each page request.
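To put the bandwidth question into rough numbers, a back-of-the-envelope estimate multiplies expected page views by average page weight. A quick Python sketch, where the traffic and page-size figures are invented purely for illustration:

```python
def monthly_transfer_gb(page_views, avg_page_kb):
    """Estimate outbound transfer: page views times average page weight (KB)."""
    return page_views * avg_page_kb / 1024 / 1024

# e.g. 50,000 monthly views of pages averaging ~2 MB (2048 KB):
print(round(monthly_transfer_gb(50_000, 2048), 1))  # 97.7 (GB per month)
```

An estimate like this is an upper bound for page traffic: caching and a CDN can serve much of that weight from other locations, so the transfer actually billed to your server is often lower.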
Shared hosting platforms are usually not set up for high volumes of traffic and processing, since the power of the server must be distributed among dozens, sometimes hundreds or thousands, of other users and websites. But for sites of average size and traffic, such as hobby sites, "pamphlet" information-only domains, or even small blogs, shared hosting is perfectly acceptable. Sites that require more intense server-side functions, like online stores, sites which generate documents such as invoices or quotes, or sites which convert audio or video on the fly, may need more resources than would come with your average Shared hosting account. Additionally, sites with higher outbound bandwidth, like those that serve up audio or documents to users, will need additional bandwidth (and disk space) that Shared hosting may not provide, and a VPS would be better in those cases. The raw number of visitors or page loads on your site may not completely describe its processing and bandwidth needs. If the site is not properly optimized, the server will have to work harder for each page load. And if you utilize a Content Delivery Network (CDN), then your outbound bandwidth usage will be considerably lower, since images and other static files will be served from other locations.
Comfort Level
Once you have your list of requirements, think about your comfort level with controlling your hosting. In the realm of both VPS and Shared hosting, there is a breadth of support types available. If you prefer a hands-off approach, you might want someone else to monitor the services on your server, help you install programs, troubleshoot server issues, and make adjustments to configurations. In that case, a fully managed hosting package with a server control panel might be better, though it comes at a slightly higher cost.
If you are comfortable working on your own server and have some command-line knowledge, an unmanaged VPS without a control panel could save some support and licensing costs. Most Shared hosting will be fully managed, since you will not have the access levels necessary to manage the machine yourself. Helpfully, some hosting providers specialize in one type of website hosting, such as supporting Joomla sites or assisting with commerce site integrations. If you know you will need assistance in the future with your specific hosting type, it may be worthwhile to seek out providers that can assist with your particular needs.
Controllability
This leads us to a major difference between VPS and Shared hosting. If you need specific software installed, or special configurations on your server, it can be hard to find a Shared hosting package that includes exactly that feature set (though it is common to find hosting providers that have already installed popular software, like FFMPEG or ImageMagick). It would also be unlikely that your host would install a special package for you on a shared machine, as it could pose a security risk to other tenants. Therefore, a Shared hosting package has low controllability. A VPS, on the other hand, gives you complete access to your system, so you can enable, disable, install, or remove any software you wish, and adjust configurations exactly to your specifications. You aren't restricted to the software that your hosting provider gives to your environment.
Resource Availability
A Shared hosting package is, of course, shared among multiple occupants. Therefore, if you have a "noisy neighbor" who is overusing CPU time or eating up memory, there will be less available for the remaining websites, including yours, causing them to suffer in performance.
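To make the noisy-neighbor effect concrete, here is a toy simulation. It is purely illustrative, with made-up demand numbers; real hosts enforce limits with kernel-level schedulers and cgroup-style controls, not arithmetic like this.

```python
# Toy model of the "noisy neighbor" problem on a shared server.
# Demands and the cap are made-up illustrative numbers.

def allocate_cpu(demands, total=100, cap=None):
    """Split `total` CPU units among tenants.

    Without a cap, an oversubscribed server scales everyone down
    proportionally, so one greedy tenant squeezes the rest. With a
    per-tenant cap, each demand is clamped before allocation.
    """
    wanted = [min(d, cap) if cap is not None else d for d in demands]
    used = sum(wanted)
    if used <= total:
        return wanted
    return [total * w / used for w in wanted]

demands = [150, 10, 10]               # tenant 0 is the noisy neighbor
print(allocate_cpu(demands))          # quiet tenants are scaled below their demand
print(allocate_cpu(demands, cap=60))  # -> [60, 10, 10]: quiet tenants get all they asked for
```

With no cap, the greedy tenant drags the others below what they asked for; with a per-tenant cap, the quiet tenants are unaffected, which is exactly the trade-off the limits described below make.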
Modern Shared hosting providers combat this by introducing resource limitations, such as maximum RAM usage, maximum number of processes, and maximum CPU percentage. These work against the "noisy neighbor" problem, but could prevent you from temporarily using extra resources to, say, run statistics or compile your nightly order list. Being able to temporarily exceed these shared resource limits is called "bursting", which is an option with some hosts. To a much lesser extent, the noisy neighbor issue is also present on Virtual Private Servers that have multiple tenants per server node. Multiple virtual servers can run on one physical server, but modern hypervisors (the software that runs the parent machine) are intelligent enough to silo VPSs very well; even if one VPS is under heavy load and running out of memory, to the point of a kernel panic or halt state, the other VPSs on the parent machine will generally take no notice at all. But several hosts also offer bursting of CPU and RAM for VPSs, which can still affect your own private server. There are "Virtual Dedicated" packages available at some hosts which provide all of the resources on one parent (dedicating it to your VPS) to avoid noisy neighbors while retaining the hypervisor's scalability and management.
Scalability
Shared server packages are generally not very elastic. Options for changing the resources on your package are generally limited to increasing your disk quota and, in some cases, removing limits on your CPU access. More meaningful adjustments would necessitate migrating your account to a more powerful server or, if one is not available, upgrading to a VPS or Dedicated hosting package, a task which takes considerable time. VPSs have more functionality available for adding or removing resources, including CPU cores, system memory, and additional disks or disk space, through your hosting provider.
If you are, for instance, running a promotion in which you expect to receive considerable extra traffic, a VPS will afford you the ability to scale up your server size, adjust your server-side settings to utilize the new resources, and, once traffic has tapered, resize back down to your original values.
Price
One of the major differences between VPS hosting and Shared hosting is the average price of each platform. Shared hosting can be had for anywhere between $2 and $30 a month from various vendors, while VPSs start somewhere around $30, with nearly no upper price boundary. With these various price points come varying amounts of resources, including support, memory/CPU resources, disk space, and bandwidth. Different hosting providers may offer different price points for seemingly identical resources, so compare them carefully. Find out what kind of scalability is available, whether there are any baked-in backup solutions for the platform, what the support response times are, and which portions of the hosting platform are managed. You should also ask what self-service documentation is available and whether you can preview the control panel and management interface for your hosting. Finally, see if there are extra costs for any of these add-ons that could affect your final monthly or yearly hosting bill.
Which One Is Right for You?
There are strong advantages to both Shared and VPS hosting, and there is no perfect catch-all answer for which you should pick; your hosting needs to be tailored to the current and future needs of your websites. But resources and costs are always driving factors. If you already have multiple Shared hosting accounts for multiple domains, you could save a good deal of money by combining them into a single VPS. And if you feel your Shared hosting service is limiting your site's performance, upgrading to a VPS can unleash its full potential by allowing you to tune settings specific to your needs.
If you are hosting just one or two domains that don't have outrageous requirements, a Shared hosting package could suit you perfectly.
Cloud VPS at Liquid Web
Cloud VPS at Liquid Web is built for reliability and performance. It's faster than AWS or Rackspace and comes standard with backups, security, and our fully managed guarantees. The post Comparing Shared Hosting vs VPS: Which Is Right For Me? appeared first on Liquid Web.

How to Get LinkedIn Leads Without Advertising

Social Media Examiner -

Do you need to generate more leads and prospects? Wondering how to identify and nurture leads organically on LinkedIn? In this article, you’ll find a three-step plan to develop profitable relationships with people on LinkedIn, without spending any money on ads. LinkedIn’s Role in the Sales Funnel I love LinkedIn, but the platform has long […] The post How to Get LinkedIn Leads Without Advertising appeared first on Social Media Marketing | Social Media Examiner.

How to Change the PHP Version in cPanel for your VPS Package

Reseller Club Blog -

VPS (Virtual Private Server) Hosting, as we have covered in our previous articles, is a powerful and dynamic hosting infrastructure combining the qualities of both Shared and Dedicated Hosting. It allows you to manage your server seamlessly with complete root access. Since a VPS is a self-managed server, it is important that you know how to enable various features on your VPS package. In this tutorial we will cover how to change the PHP version in cPanel for your VPS package; however, before we get to the tutorial, let us first have a look at what PHP is. PHP stands for Hypertext Preprocessor (earlier called Personal Home Page). It is an open-source, server-side scripting language that enables developers to build web applications, and it can be embedded directly into HTML code. PHP also helps create dynamic web pages for web apps, e-commerce apps, and database applications. It is platform-independent and connects with several databases like MySQL, Oracle, PostgreSQL, etc. Now that we've covered the basics of PHP, let us move on to how to change the PHP version in cPanel. Follow these steps to change the PHP version for a particular domain hosted under the VPS package:
Log in to your Reseller Account
Log in to your ResellerClub control panel to see the active and expiring orders in your account.
Accessing your Products
To access your orders, go to your control panel dashboard and click on Products → List All Orders (image 1), then click on the Domain Name associated with the VPS package you want to access (image 2).
Accessing the VPS Linux KVM package
After clicking on the Domain Name, you'll be redirected to the 'Domain Overview' page (image 3). Scroll down to the VPS Linux KVM tab.
Here, click on Admin Details, and then click on the cPanel link to log in to the WHM panel (image 4).
Accessing your WHM Panel
After clicking on the cPanel link, the WHM page opens up. Enter the Username and Password for the WHM Panel, and click on Login.
In the WHM Panel
On the WHM dashboard, from the Home page, go to Software → MultiPHP Manager. There are a bunch of software plugins in this tab; to change your PHP version, click on the 'MultiPHP Manager' plugin.
In the MultiPHP Manager Plugin
After clicking on the plugin, select the Domain name for which you wish to modify the PHP version. This can be found at the bottom of the page. Note: Do not change anything else on the page.
Set PHP Version per Domain
Click on the drop-down option for 'PHP Version', select the desired PHP version from the list of available versions, and then click 'Apply'. After this step, you will see that the PHP version of the domain name has changed.
Conclusion
With this, we come to the end of our quick tutorial on how to change the PHP version in cPanel for your VPS package. If you have any suggestions, queries, or questions, feel free to leave a comment below and we'll get back to you. Until next time, folks! The post How to Change the PHP Version in cPanel for your VPS Package appeared first on ResellerClub Blog.
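As a follow-up to the tutorial above: one quick way to confirm the change took effect is to look at the X-Powered-By response header, which many PHP servers send by default. Note the caveats: the header only appears when PHP's expose_php setting is on, so this is a best-effort check, and the URL below is a placeholder, not a real domain.

```python
# Best-effort check of a site's PHP version via the X-Powered-By
# header (only present when PHP's expose_php setting is enabled).
import re
import urllib.request

def php_version_from_headers(headers):
    """Extract e.g. '7.4' from a header mapping like {'X-Powered-By': 'PHP/7.4.33'}."""
    powered_by = headers.get("X-Powered-By", "")
    match = re.search(r"PHP/(\d+\.\d+)", powered_by)
    return match.group(1) if match else None

def check_site(url):
    with urllib.request.urlopen(url) as resp:
        return php_version_from_headers(dict(resp.headers))

print(php_version_from_headers({"X-Powered-By": "PHP/7.4.33"}))  # -> 7.4
# check_site("http://example.com/")  # placeholder URL; returns None if the header is hidden
```

A more reliable check is to upload a small file calling phpinfo() and view it in the browser, then remove it, since phpinfo() works regardless of the expose_php setting.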

Join Cloudflare & PicsArt at our meetup in Yerevan!

CloudFlare Blog -

Cloudflare is partnering with PicsArt to host a meetup this month at the PicsArt office in Yerevan. We would love to invite you to join us to learn about the newest in the Internet industry. You'll join Cloudflare's users, stakeholders from the tech community, and engineers from both Cloudflare and PicsArt.
Tuesday, 4 June, 18:30-21:00
PicsArt office, Yerevan
Agenda:
18:30-19:00 Doors open, food and drinks
19:00-19:30 Areg Harutyunyan, Engineering Lead of Argo Tunnel at Cloudflare: "Cloudflare Overview / Cloudflare Security: How Argo Tunnel and Cloudflare Access enable effortless security for your team"
19:30-20:00 Gerasim Hovhannisyan, Director of IT Infrastructure Operations at PicsArt: "Scaling to 10PB Content Delivery with Cloudflare's Global Network"
20:00-20:30 Olga Skobeleva, Solutions Engineer at Cloudflare: "Security: the Serverless Future"
20:30-21:00 Networking, food and drinks
View Event Details & Register Here »
We hope to meet you soon. Here are some photos from the meetup at PicsArt last year:

Stopping SharePoint’s CVE-2019-0604

CloudFlare Blog -

On Saturday, 11th May 2019, we got the news of a critical web vulnerability being actively exploited in the wild by advanced persistent threats (APTs), affecting Microsoft's SharePoint server (versions 2010 through 2019). This was CVE-2019-0604, a Remote Code Execution vulnerability in Microsoft SharePoint Servers which was not previously known to be exploitable via the web. Several cyber security centres, including the Canadian Centre for Cyber Security and Saudi Arabia's National Center, put out alerts for this threat, indicating it was being exploited to download and execute malicious code which would in turn take complete control of servers. The affected software versions:
Microsoft SharePoint Foundation 2010 Service Pack 2
Microsoft SharePoint Foundation 2013 Service Pack 1
Microsoft SharePoint Server 2010 Service Pack 2
Microsoft SharePoint Server 2013 Service Pack 1
Microsoft SharePoint Enterprise Server 2016
Microsoft SharePoint Server 2019
Introduction
The vulnerability was initially given a critical CVSS v3 rating of 8.8 on the Zero Day Initiative advisory (which, however, states that authentication is required). That would imply only an insider threat, someone who has authorisation within SharePoint, such as an employee on the local network, could exploit the vulnerability. We discovered that was not always the case, since there were paths which could be reached without authentication, via external facing websites. Using the NIST NVD calculator, the base score comes out to 9.8 out of 10 once the authentication requirement is removed. As part of our internal vulnerability scoring process, we decided this was critical enough to require immediate attention, for a number of reasons. The first being it was a critical CVE affecting a major software ecosystem, primarily aimed at enterprise businesses. There appeared to be no stable patch available at the time.
And there were several reports of it being actively exploited in the wild by APTs. We deployed an initial firewall rule the same day, rule 100157. This allowed us to analyse traffic and request frequency before making a decision on the default action. At the same time, it gave our customers the ability to protect their online assets from these attacks in advance of a patch. We observed the first probes at around 4:47 PM on the 11th of May, which went on until 9:12 PM. We have reason to believe these were not successful attacks, and were simply reconnaissance probes at this point. The online vulnerable hosts exposed to the web were largely made up of high-traffic enterprise businesses, which makes sense based on the below graph from W3Techs.
Figure 1: Depicts SharePoint's market position (Image attributed to W3Techs)
The publicly accessible proof of concept exploit code found online did not work out of the box. Therefore it was not immediately widely used, since it required weaponisation by a more skilled adversary. We give customers advance notice of most rule changes. However, in this case, we decided that the risk was high enough that we needed to act, and so made the decision to make an immediate rule release to block this malicious traffic for all of our customers on May 13th. We were confident enough in going default block here, as the requests we'd analysed so far did not appear to be legitimate. We took several factors into consideration to determine this, some of which are detailed below. The bulk of requests we'd seen so far, a couple hundred, originated from cloud instances within the same IP ranges. They were enumerating the subdomains of many websites over a short time period. This is a fairly common scenario: malicious actors will perform reconnaissance using various methods in an attempt to find a vulnerable host to attack, before actually exploiting the vulnerability.
The query string parameters also appeared suspicious, having only the ones necessary to exploit the vulnerability and nothing more. The rule was deployed in default block mode, protecting our customers before security researchers discovered how to weaponise the exploit and before a stable patch from Microsoft was widely adopted.
The vulnerability
Zero Day Initiative did a good job of drilling down on the root cause of this vulnerability, and how it could potentially be exploited in practice. From debugging the .NET executable, they discovered the following functions could eventually reach the deserialisation call, and so may potentially be exploitable.
Figure 2: Depicts the affected function calls (Image attributed to Trend Micro Zero Day Initiative)
The most interesting ones here are the ".Page_Load" and ".OnLoad" methods, as these can be directly accessed by visiting a webpage. However, only one appears to not require authentication: ItemPicker.ValidateEntity, which can be reached via the Picker.aspx webpage. The vulnerability lies in the following function calls:
EntityInstanceIdEncoder.DecodeEntityInstanceId(encodedId);
Microsoft.SharePoint.BusinessData.Infrastructure.EntityInstanceIdEncoder.DecodeEntityInstanceId(pe.Key);
Figure 3: PickerEntity Validation (Image attributed to Trend Micro Zero Day Initiative)
The PickerEntity ValidateEntity function takes "pe" (Picker Entity) as an argument. After checking that pe.Key is not null and that it matches the necessary format, via a call to Microsoft.SharePoint.BusinessData.Infrastructure.EntityInstanceIdEncoder.IsEncodedIdentifier(pe.Key), it continues to define an object of identifierValues from the result of Microsoft.SharePoint.BusinessData.Infrastructure.EntityInstanceIdEncoder.DecodeEntityInstanceId(pe.Key), where the actual deserialisation takes place. Otherwise, it will raise an AuthenticationException, which will display an error page to the user. The affected function call can be seen below.
First, there is a conditional check on the encodedId argument which is passed to DecodeEntityInstanceId(); if it begins with __, it will continue on to deserialising the XML schema with xmlSerializer.Deserialize().
Figure 4: DecodeEntityInstanceId leading to the deserialisation (Image attributed to Trend Micro Zero Day Initiative)
When reached, the encodedId (in the form of an XML serialised payload) would be deserialised and eventually executed on the system in a SharePoint application pool context, leading to a full system compromise. One such XML payload, which spawns a calculator (calc.exe) instance via a call to the command prompt (cmd.exe):
<ResourceDictionary
  xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
  xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
  xmlns:System="clr-namespace:System;assembly=mscorlib"
  xmlns:Diag="clr-namespace:System.Diagnostics;assembly=system">
  <ObjectDataProvider x:Key="LaunchCalch" ObjectType="{x:Type Diag:Process}" MethodName="Start">
    <ObjectDataProvider.MethodParameters>
      <System:String>cmd.exe</System:String>
      <System:String>/c calc.exe</System:String>
    </ObjectDataProvider.MethodParameters>
  </ObjectDataProvider>
</ResourceDictionary>
Analysis
When we first deployed the rule in log mode, we did not initially see many hits, other than a hundred probes later that evening. We believe this was largely due to the unknowns of the vulnerability and its exploitation, as a number of conditions had to be met to craft a working exploit that were not yet widely known. It wasn't until after we had set the rule to default drop mode that we saw the attacks really start to ramp up.
On Monday the 13th we observed our first exploit attempts, and on the 14th we saw what we believe to be individuals manually attempting to exploit sites for this vulnerability. Given this was a weekend, it realistically gave you one working day to roll out a patch across your organisation before malicious actors started attempting to exploit this vulnerability.
Figure 5: Depicts requests matched; rule 100157 was set as default block early on 13th May.
Further into the week, we started seeing smaller spikes for the rule. And on the 16th of May, the same day the UK's NCSC put out an alert reporting highly successful exploitation attempts against UK organisations, thousands of requests were dropped, primarily launched at larger enterprises and government entities. This is often the nature of such targeted attacks: malicious actors will try to automate exploits to have the biggest possible impact, and that's exactly what we saw here. So far into our analysis, we've seen malicious hits for the following paths:
/_layouts/15/Picker.aspx
/_layouts/Picker.aspx
/_layouts/15/downloadexternaldata.aspx
The bulk of attacks we've seen have been targeting the unauthenticated Picker.aspx endpoint, as one would expect, using the ItemPickerDialog type:
/_layouts/15/picker.aspx?PickerDialogType=Microsoft.SharePoint.Portal.WebControls.ItemPickerDialog
We expect the vulnerability to be exploited more once a complete exploit is publicly available, so it is important to update your systems if you have not already. We also recommend isolating these systems to the internal network in cases where they do not need to be external facing, in order to avoid an unnecessary attack surface. Sometimes it's not practical to isolate such systems to an internal network; this is usually the case for global organisations with teams spanning multiple locations. In these cases, we highly recommend putting these systems behind an access management solution, like Cloudflare Access.
This gives you granular control over who can access resources, and has the additional benefit of auditing user access. Microsoft initially released a patch, but it did not address all vulnerable functions, so customers were left vulnerable, with the only options being to virtually patch their systems or shut their services down entirely until an effective fix became available. This is a prime example of why firewalls like Cloudflare's WAF are critical to keeping a business online. Sometimes patching is not an option, and even when it is, it can take time to roll out effectively across an enterprise.
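To illustrate the kind of virtual patching described in this post: the reported indicators (the three vulnerable paths and the ItemPickerDialog query parameter) can be recognised with a simple pattern check. This sketch is not Cloudflare's actual rule 100157, whose logic is not public; it is a minimal illustration built only from the indicators quoted above.

```python
# Illustrative triage of requests against the CVE-2019-0604 indicators
# reported in this post. Not Cloudflare's actual WAF rule logic.
import re

# The three paths the post reports seeing malicious hits against.
SUSPECT_PATHS = re.compile(
    r"/_layouts/(15/)?(picker|downloadexternaldata)\.aspx", re.IGNORECASE
)

def classify_request(path, query=""):
    """Rough triage: 'block' for the observed exploit pattern,
    'suspicious' for other hits on the vulnerable endpoints."""
    if not SUSPECT_PATHS.search(path):
        return "pass"
    if "ItemPickerDialog" in query:
        # The unauthenticated exploitation pattern observed in the wild.
        return "block"
    return "suspicious"

print(classify_request(
    "/_layouts/15/picker.aspx",
    "PickerDialogType=Microsoft.SharePoint.Portal.WebControls.ItemPickerDialog",
))  # -> block
print(classify_request("/_layouts/15/downloadexternaldata.aspx"))  # -> suspicious
print(classify_request("/_layouts/15/start.aspx"))  # -> pass
```

A real WAF rule would inspect far more than the path and query string, but even this toy version shows why signature-based virtual patching can buy time while a vendor patch is tested and rolled out.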

What are “Carrier Hotels” and Why Are They Valuable to Your Business?

The Rackspace Blog & Newsroom -

60 Hudson Street in New York City occupies an entire city block in Manhattan — a pretty valuable bit of real estate. Originally known as the Western Union building, this 1930 art-deco-style beauty once served as the technological center of the global telegraph system. Today, it serves as the modern version of a technological epicenter, […] The post What are “Carrier Hotels” and Why Are They Valuable to Your Business? appeared first on The Official Rackspace Blog.

Craft CMS vs WordPress: Which CMS Is Content King?

Nexcess Blog -

A content-heavy site is about more than just creating content. It’s about managing it. Site owners need to ask themselves what type of user experience they want to deliver. This article will look at two distinct options on the market: Craft CMS and WordPress. WordPress, the most popular CMS around, is free and open source.… Continue reading →

What Are the Limitations of Shared Hosting?

InMotion Hosting Blog -

If you do an online search for a hosting plan, chances are you will be inundated with "cheap" shared hosting options from a considerable number of companies. For first-time website owners looking to dip a toe into the internet pool, these aren't bad plans to have. However, as with anything, there are some downsides to using shared hosting. Let's look at the limitations of a shared plan so that you can make an educated choice on whether it will work for you or if you need to step up to a better plan. Continue reading What Are the Limitations of Shared Hosting? at The Official InMotion Hosting Blog.

What Is Windows Hosting?

HostGator Blog -

The post What Is Windows Hosting? appeared first on HostGator Blog. When figuring out which web hosting plan is the right choice for your business, you have two main server options: Windows hosting and Linux hosting. If you don't know anything about what they are and how they're different, you could risk making the wrong decision out of the gate and face trouble down the line getting your website working the way you'd like. To help you make the best decision for your business, we've put together this guide on what Windows hosting is, who it's for, and what to look for in a Windows hosting provider. What Is Windows Hosting? Windows hosting is website hosting that uses a Windows operating system. Because the most common web hosting plan options operate on Linux, you can usually assume that anytime a web hosting company doesn't specify that a plan is Windows hosting, it's Linux hosting. But for a certain subset of website owners, Windows server hosting is the better choice, and it's important to seek out a plan that provides the particular features that come with it. 3 Pros of Windows Hosting Windows hosting isn't for everyone, but it offers some unique benefits for the businesses it's a good fit for. 1. It runs on the familiar Windows operating system (OS). Windows is the most popular operating system in the world. The current OS version alone has over 400 million users, and some estimates put the total user base at more than 1 billion. The Windows system is familiar, and using it is straightforward for millions of people around the world. That said, when it comes to web hosting, most website owners don't access the web server's operating system directly. With Linux hosting, you typically use cPanel, and with Windows server hosting you have the option of the Plesk control panel. But for anyone who plans to use the server's interface directly, Windows hosting provides a much more intuitive user experience than the Linux user interface. 2.
It provides compatibility with other Windows tools. For a lot of businesses, this is the main reason to choose Windows hosting. If your business depends on a number of other Windows tools, then choosing a web hosting platform that plays nicely with the various other software products and solutions you depend on will make your life easier. If your website was built using ASP.NET, then you'll need to use a Windows server. Companies that use a Microsoft structured query language (SQL) server for their website and databases will also need to stick with Windows server hosting. The same goes if your business uses Microsoft Exchange for your company's email server and Microsoft SharePoint for your project management and team collaboration. Basically, the more your business depends on Windows programs, the more likely you are to need Windows server hosting. You can always trust that your web hosting platform will be compatible with all your other legacy programs. And notably, you can trust that whenever your OS and related programs have an update, you don't have to worry about losing functionality of other programs that depend on them, since everything comes from the same company. Another point worth considering is that most businesses that use a number of Windows-based tools will already employ tech support professionals who are experts in using, maintaining, and updating Windows programs. The people who already know how to manage your Windows products will have no problem also using your Windows server hosting. 3. It comes with the easy-to-use Plesk control panel. Where Linux web hosting has cPanel, Windows hosting has the Plesk control panel (although it should be noted that Plesk is also an option with Linux hosting, just not as commonly used as cPanel).
While many users who are already familiar with the OS can directly use the Windows server hosting interface, those who want something a little more user-friendly can count on the Plesk control panel to make it easier to make updates and changes to a website. Plesk provides an intuitive user interface that lets you create and manage multiple websites and domains, set up email accounts, and manage reseller accounts. Unlike cPanel, it's more specifically focused on the needs of commercial website and app owners. It's compatible with content management systems (CMS) like WordPress, Drupal, and Joomla. And Plesk offers a number of different extensions companies can use to add more functionality to the control panel, including many that increase the website's security, improve website performance, or add new applications to the website. 3 Cons of Windows Hosting While the benefits of Windows server hosting are significant, there's a reason that Linux is the default for web hosting solutions. Windows hosting is great for what it does, but it has a few notable downsides as well. 1. It's not as secure as Linux hosting. Websites that run on Windows have been the victims of ransomware attacks in recent years more frequently than Linux ones have. For that reason, Linux hosting is widely considered the more secure option for websites. For business websites, security is an important consideration, especially if you sell products through the site and thus collect sensitive financial information from your customers. But even if you don't have an eCommerce website, if hackers take your website down for hours or days, it's bad for business and for your overall reputation. While Linux web hosting beats Windows hosting in the security category, which of the two you pick is just one of many factors that influence website security.
Even going with Windows hosting, you can do a lot to keep your website protected from hackers by making sure the web hosting company you choose invests in basic precautions, adding additional security software or extensions to your website, keeping all the website software you use up to date, and being careful about the levels of access you provide different people working on the site. 2. It’s more expensive. Windows hosting does cost a bit more than Linux hosting, but the difference is fairly minimal. For instance, HostGator’s Linux-based plans start at $2.75 a month, and our Windows hosting plans start at $4.76. For both types of web hosting plans, the costs do go up as the website’s needs increase, but even for an enterprise-level plan, Windows hosting only costs $14.36 a month. For most of the businesses making a decision—especially for enterprise businesses that already depend on a number of Windows products and systems—those numbers should be manageable, even as Windows hosting costs more. 3. It can be less stable. Linux servers are known for being extremely reliable. They rarely need to be rebooted and can smoothly handle many functions at a time. Windows servers, in contrast, tend to have a little more trouble consistently handling a large number of apps and tasks at a time without interruption. As with any technology on the web, Windows servers have improved in this area over time and can be expected to continue to do so, but they’re generally not quite as consistent in their performance as Linux servers. That said, this is another area where choosing the right web provider can make a big difference. A Windows hosting company that includes a 99.9% uptime guarantee is promising they’ve taken all the precautions to keep their Windows servers working as consistently as possible. With the right attention to maintenance and preparation, a good web hosting provider can help overcome the difference in server reliability between Windows and Linux hosting. 
Who Should Use Windows Hosting?

For most new businesses starting out on the web, a shared hosting or cloud hosting plan based on a Linux server makes the most sense because of its affordability and reliability. Where Windows hosting really makes the most sense is for businesses, mostly enterprise or other large businesses that have been around for a while, that have built their website and other systems on Windows programs. If a lot of your business's tech relies on Windows, then Windows server hosting is the natural choice. It will work seamlessly with all the other programs you use and won't require you to rebuild anything from scratch. And the tech professionals your company relies on to keep all your systems working properly will already know how to work with the Windows hosting OS. In short, if Windows hosting is the right choice for your company, your IT team will likely have strong feelings on the subject. Make sure you bring them into the conversation and let their input guide your web hosting choice.

What to Look for in a Windows Hosting Provider

If you've determined you do need Windows server hosting, it's important to find the right web hosting company and plan for your needs. When researching your options, here are a few good features to look for.

Software Compatibility

Any Windows hosting plan should provide the compatibility you need for all your Windows-based programs and software, including:

- ASP Classic
- ASP.NET
- MSSQL (Microsoft SQL Server)
- MS Access (Microsoft Access)
- Visual Basic Development
- C#
- Remote Desktop
- Microsoft Exchange
- Microsoft SharePoint

If your business relies on any of these, it doesn't hurt to double-check that the Windows web hosting plan you choose will work seamlessly with them. In addition, if you use a popular CMS, like WordPress, check that the web hosting plan is compatible with that as well.
If you add website tools like Google Analytics and AWStats, or eCommerce software like Magento or WooCommerce for an online store, be sure to check those as well. Often, web hosting companies will advertise compatibility with common software solutions on their website; if you don't find the information there, you can check with the sales team.

Stability

While, as previously mentioned, Windows hosting has a weaker reputation for stability than Linux, finding the right web hosting provider can make all the difference in ensuring your website works consistently. The thing to look for here is a promise of at least 99.9% uptime. That's how often the company guarantees your website will be up and accessible to your visitors. Some companies, like HostGator, even back that promise up with a money-back guarantee, so you know they're serious. Even if Windows servers aren't quite as reliable as Linux ones, a company that knows how to take proper care of them can help make up the difference.

Security

Every business has to prioritize website security; the stakes are too high not to. Windows hosting may have more vulnerabilities than Linux hosting, but by choosing a provider with a strong reputation for security, you can avoid much of the risk. Look for a company that uses firewalls to protect its servers and that provides SSL certificates as part of its offerings. And check whether it provides additional security software or other options to help you further protect your website from hackers.

Reputation

Reviewing the different plans available from various web hosting companies and the promises they make is just one part of making an informed decision. Also look into their reputation in the larger industry. Check out websites with third-party reviews to get an unbiased look at a company's reputation and determine whether it's in line with what you're looking for.
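To make the 99.9% uptime promise discussed above concrete, here's a small illustrative sketch (the function is our own, not any hosting provider's tooling) that converts an uptime percentage into the downtime it actually permits:

```python
# Illustrative sketch: how much downtime does an uptime guarantee allow?
# A 99.9% guarantee sounds near-perfect, but it still permits some outage time.

def allowed_downtime_minutes(uptime_percent: float, days: int = 30) -> float:
    """Maximum downtime, in minutes, permitted over `days` by a guarantee
    of `uptime_percent` percent uptime."""
    total_minutes = days * 24 * 60
    return total_minutes * (1 - uptime_percent / 100)

# A 99.9% guarantee over a 30-day month allows about 43.2 minutes of downtime,
# while a 99% guarantee would allow about 432 minutes (7.2 hours).
print(round(allowed_downtime_minutes(99.9), 1))
print(round(allowed_downtime_minutes(99.0), 1))
```

That extra decimal place matters: the difference between 99% and 99.9% is the difference between hours and minutes of potential outage each month, which is why 99.9% is a reasonable minimum to look for.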
Customer Service

Even if you have a tech team full of Windows experts, you may occasionally need customer service help from your web hosting provider. Confirm that your web hosting choice offers 24/7 customer service. If your website ever fails, you need to be sure you can get it working again right away. And make sure they offer multiple ways to get in touch. You should be able to use the communication format of your choice, whether that's phone, live chat, customer portal, or email.

Find the Best Windows Hosting Plan for You

If Windows hosting is the right choice for your company, make sure you find the particular Windows hosting plan and provider that offers what you need. HostGator's Windows hosting solutions cover all the bases we've discussed here, and you can choose between two levels based on whether you're running an SMB or a larger enterprise company. Either way, you can count on the compatibility and features you need at a reasonable price. Whether you're looking for Windows hosting, Virtual Private Server hosting, or dedicated server hosting, HostGator has you covered. Contact our team of experts today to get started.
