Why Public Cloud is Best for Big Data

2018-03-05 13:14:22 | Technology


If you are one of the growing number of companies using Big Data, this post will explain the benefits and challenges of migrating to the public cloud and why it could be the ideal environment for your operations.

Cloud-based Big Data on the rise

According to a recent survey by Oracle, 80% of companies are planning to migrate their Big Data and analytics operations to the cloud. One of the main factors behind this is the success these companies have had when dipping a toe into Big Data analytics. Another survey of US companies found that over 90% of enterprises had carried out a Big Data initiative in the last year and that, in 80% of cases, those projects had highly beneficial outcomes.

Most initial trials with Big Data are carried out in-house. However, many of the companies that find them successful want to expand their Big Data operations and see the cloud as the better place to do it. The reason is that the IaaS, PaaS and SaaS solutions offered by cloud vendors are much more cost effective than developing in-house capacity.

One of the issues with in-house Big Data analysis is that it frequently involves the use of Hadoop. Whilst Apache's open source software framework has revolutionised storage and Big Data processing, in-house teams find it very challenging to use. As a result, many businesses are turning to cloud vendors who can provide Hadoop expertise as well as other data processing options.
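
To give a flavour of what that in-house burden looks like, here is a minimal sketch of a classic Hadoop Streaming word-count job written in Python. It is purely illustrative: the cluster setup, HDFS storage and job submission commands around it (which vary between installations) are exactly the pieces that managed cloud services take off your hands.

```python
#!/usr/bin/env python3
"""Illustrative sketch only: a minimal Hadoop Streaming word-count job.

Even this trivial job has to be wired into a cluster, HDFS storage and a
job submission command (all installation-specific and assumed here), which
is the plumbing an in-house team ends up maintaining itself.
"""
import sys


def map_stdin():
    # Emit "word<TAB>1" for every word read from standard input.
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")


def reduce_stdin():
    # Hadoop delivers mapper output sorted by key, so counts for the same
    # word arrive together and can be summed with a running total.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")


if __name__ == "__main__":
    map_stdin() if sys.argv[1:] == ["map"] else reduce_stdin()
```

You can test the logic locally without a cluster (for example, `cat file.txt | python3 wordcount.py map | sort | python3 wordcount.py reduce`); running it at scale is where the Hadoop infrastructure, and the expertise to operate it, comes in.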

The benefits of moving to the public cloud

One of the main reasons for migrating is that public cloud Big Data services provide clients with essential benefits. These include on-demand pricing, access to data stored anywhere, increased flexibility and agility, rapid provisioning and better management.

On top of this, the unparalleled scalability of public cloud means it is ideal for handling Big Data workloads. Businesses can instantly have all the storage and computing resources they need and only pay for what they use. Public cloud can also provide increased security that creates a better environment for compliance.

Software as a service (SaaS) has also made public cloud Big Data migration more appealing. By the end of 2017, almost 80% of enterprises had adopted SaaS, a rise of 17% from 2016, and over half of them use multiple data sources. As the bulk of their data is already stored in the cloud, it makes good business sense to analyse it there rather than go through the process of moving it back to an in-house data centre.

Another benefit of public cloud is the falling cost of data storage. While many companies might currently consider long-term Big Data storage in the cloud expensive compared to in-house storage, developments in technology are already bringing costs down and will continue to do so. At the same time, the public cloud's ability to process that data in greater volumes and at faster speeds keeps improving.

Finally, the cloud enables companies to leverage other innovative technologies, such as machine learning, artificial intelligence and serverless analytics. The pace of development these bring means that companies who are late in adopting Big Data in the public cloud can find themselves at a competitive disadvantage. By the time they migrate, their competitors are already eating into their market.

The challenges of moving Big Data to the public cloud

Migrating huge quantities of data to the public cloud does present a few obstacles. Integration is one such challenge. A number of enterprises find it difficult to integrate data spread across a range of different sources, and others have found it challenging to integrate cloud data with data stored in-house.

Workplace attitudes also pose a barrier to migration. In a recent survey, over half of respondents claimed that internal reluctance, incoherent IT strategies and other organisational problems created significant issues in their plans to move Big Data initiatives to public cloud.

There are technical issues to overcome too, particularly around data management, security and the integration challenges mentioned above.

Planning your migration

Before starting your migration, it is important to plan ahead. If you intend to fully move Big Data analyses to the public cloud, the first thing to do is to cease investment in in-house capabilities and focus on developing a strategic plan for your migration, beginning with the projects that are most critical to your business development.

Moving to the cloud also offers scope to improve what you already have. For this reason, don't plan to make your cloud infrastructure a direct replica of your in-house setup. Migration is the ideal opportunity to build something from the ground up that is designed for the future, delivers more than your current infrastructure and takes advantage of everything the cloud has to offer: automation, AI, machine learning and so on.

Finally, you need to decide on the type of public cloud service that best fits your current and future needs. Businesses have a range of choices when it comes to cloud-based Big Data services; these include software as a service (SaaS), infrastructure as a service (IaaS) and platform as a service (PaaS), and you can even get machine learning as a service (MLaaS). Which level of service you opt for will depend on a range of factors, such as your existing infrastructure, compliance requirements, Big Data software and in-house expertise.
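
To make the difference between these service levels more concrete, below is a hedged sketch of what a PaaS-style approach can look like: requesting a managed Hadoop/Spark cluster through a vendor's API rather than building and operating one yourself. It assumes AWS EMR and the boto3 Python library purely as an example; other providers offer equivalents, and the region, instance types, roles and release label shown are placeholder assumptions, not recommendations.

```python
# Hedged sketch of requesting a managed (PaaS-style) Big Data cluster,
# assuming AWS EMR via boto3. Region, roles, instance types and the
# release label are placeholder values and will differ per organisation.
import boto3

emr = boto3.client("emr", region_name="eu-west-2")

response = emr.run_job_flow(
    Name="analytics-poc",                      # hypothetical cluster name
    ReleaseLabel="emr-5.12.0",                 # EMR release bundling Hadoop and Spark
    Applications=[{"Name": "Hadoop"}, {"Name": "Spark"}],
    Instances={
        "MasterInstanceType": "m4.large",
        "SlaveInstanceType": "m4.large",
        "InstanceCount": 3,                    # one master plus two workers
        "KeepJobFlowAliveWhenNoSteps": False,  # shut down when idle: pay only for use
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Cluster requested:", response["JobFlowId"])
```

With IaaS you would instead provision raw virtual machines and install and tune Hadoop yourself; with SaaS or MLaaS you would skip cluster management altogether and work through the vendor's analytics interface.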

Conclusion

Migrating Big Data analytics to the public cloud offers businesses a raft of benefits: cost savings, scalability, agility, increased processing capability, better access to data, improved security and access to technologies such as machine learning and artificial intelligence. Whilst the move does have obstacles to overcome, the advantages of being able to analyse Big Data in the cloud give companies a competitive edge right from the outset.

If you are considering migrating to the cloud, check out our range of cloud hosting packages. If you need advice about the best solution for you or help with migrating, call us on 0800 862 0380.
You can also check out our dedicated server hosting options.

Meltdown and Spectre: New Intel CPU Vulnerabilities Discovered

2018-01-15 15:47:11 | Technology
Until this week, Spectre was just the fictitious name of a secret organisation bent on world domination in a James Bond movie. Today, it is something far more real, though equally sinister: a design flaw in Intel microprocessors that leaves any data in their memory vulnerable to malicious exploitation.



Intel's flaw means that applications, malware and even JavaScript running in web browsers can access the contents of your operating system's private memory areas and steal your credentials and personal information. What's more, on shared systems, such as a public cloud server, there is the possibility for software to use the machine's physical memory as a way to access data stored on other customers' virtual machines.

Unfortunately, Intel's are not the only CPUs at risk. Processors from Arm and AMD are also affected: AMD admits its CPUs are vulnerable in 'some situations', while Arm has published a list of affected cores, most of which are found in mobile devices.

The vulnerability stems from the way modern, high-speed CPUs work. To operate faster, these chips attempt to guess which instructions they will be given next and, in doing so, fetch the data needed to carry those instructions out ahead of time. This is known as speculative execution. If the CPU makes the wrong guess, it has to discard that speculative work and execute the code which is actually required instead.

Unfortunately, these chips do not completely clear away the traces of that speculative work, so remnants of it remain in temporary caches where they can be probed later. The problem is that cleverly written malware can use those leftovers to infer the contents of the CPU's kernel memory.

It has now been discovered that there are two main vulnerabilities built around the exploitation of speculative execution in modern CPUs; these have been named Meltdown and Spectre.

Meltdown

Meltdown can be employed by normal computer applications to read the contents of private kernel memory. The vulnerability has an enormous reach, potentially affecting every out-of-order execution Intel processor manufactured since 1995, with the exception of Itanium and pre-2013 Atom chips.

If you have these chips installed, there are now temporary workaround patches available for macOS, Windows and Linux, and you should update your operating system without delay. The patches work by moving the operating system kernel into a separate virtual address space. Although this improves security, users with high processing demands may notice an effect on performance, depending on the CPU model and the software being run.
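
If your servers run Linux, kernels recent enough to include the patches also report their mitigation status through sysfs, so you can verify where you stand. The short check below is a sketch that assumes this interface is present; older, unpatched kernels simply will not have it.

```python
# Quick check of CPU vulnerability mitigation status on Linux.
# Assumes a kernel new enough to expose /sys/devices/system/cpu/vulnerabilities;
# on older, unpatched kernels the directory will not exist at all.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")

if not vuln_dir.is_dir():
    print("No vulnerability reporting found - the kernel likely predates the patches.")
else:
    for entry in sorted(vuln_dir.iterdir()):   # e.g. meltdown, spectre_v1, spectre_v2
        print(f"{entry.name}: {entry.read_text().strip()}")
```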

Whilst Meltdown does not affect AMD processors, users of brands other than Intel, such as certain Arm cores, may still be affected.

Important note for Windows users

Microsoft has released a patch to block the Meltdown vulnerability. However, before installing, users and administrators should check that their antivirus software is compatible with the patch. Failure to do so could result in the blue screen of death. This is because the antivirus package needs to update a registry key before installation can occur. For this reason, Microsoft has set the update to apply only when the registry key has been changed.
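
If you want to confirm that flag yourself before expecting Windows Update to offer the patch, the sketch below checks for it using Python's standard winreg module. The key and value name are those Microsoft documented for this update, but your antivirus vendor's own guidance remains the authoritative source.

```python
# Check whether the antivirus compatibility flag for the Meltdown update has
# been set (Windows only). Key and value name as documented by Microsoft for
# this patch; a missing value generally means the update will be held back.
import winreg

KEY_PATH = r"SOFTWARE\Microsoft\Windows\CurrentVersion\QualityCompat"
VALUE_NAME = "cadca5fe-87d3-4b96-b7fb-a231484277cc"

try:
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        value, value_type = winreg.QueryValueEx(key, VALUE_NAME)
        print(f"Compatibility flag present (value={value}) - the update should be offered.")
except FileNotFoundError:
    print("Compatibility flag not set - check with your antivirus vendor before patching.")
```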

Spectre

Spectre enables user-mode applications to extract data from their own processes and from others running on the same system; for example, it could extract the login cookies stored in a browser's memory. One of the problems with Spectre is that it is a difficult vulnerability to patch; at present, there are no solid fixes available for either Intel or AMD CPUs. Luckily, the vulnerability is equally challenging to exploit.

With regard to protecting yourself against a Spectre attack, the advice at present is to:

Install any operating system and firmware security updates as soon as possible
Do not run any untrusted code
Consider turning on Chrome's site isolation to prevent malicious websites from attempting to steal session cookies
Install security patches for the Xen hypervisor as soon as they become available
Patch VMware's ESXi, Workstation and Fusion hypervisors

Information for eUKhost Customers

Staff at eUKhost are keeping fully up-to-date with developments regarding Spectre and Meltdown. As part of our managed services, we are installing patches as soon as they become available from vendors and open source maintainers, just as we do with all other security issues. As you would expect, we have a highly vigilant monitoring system in operation together with a range of other effective security measures.

The patches currently being released are a temporary workaround until Intel and the other processor manufacturers find a permanent solution. This is expected within the next week.

If you are an eUKhost customer and are concerned about the security of your system, please contact our 24/7 customer support team.

How to Stop Your Website Going Offline

2017-09-23 10:23:47 | Technology
If you have a website, it is important that it is always available. The last thing you need is for visitors to discover your site has gone down, especially if it is a critical element of your business. In this post, we’ll look at why websites can suffer downtime and what you can do to prevent it happening. We’ll also show you what to do if your website does go offline.



Reasons why websites can go offline
There are numerous reasons why a website can go offline: it can be caused by everything from natural disasters, such as floods at your web host’s data centre, to you accidentally putting your site into maintenance mode. Some causes are more common than others, however, and in this next section, we’ll look at the ones which are most likely to affect your site.

1. Scheduled server maintenance

Like any other computer, the server on which your website is hosted needs to be looked after. From time to time, your web host will need to update software, install security patches and upgrade hardware. Whilst much of this can be done while the server is still in operation, occasionally, this will mean the server needs to be temporarily taken offline or rebooted. When this happens, your website may be unavailable (though there are some forms of hosting where this is not necessary).
Web hosts are aware of how this can affect customers and they undertake their maintenance at times which are least likely to cause disruption to your business. For example, they will avoid times of the day when web traffic is the busiest. However, if your web host is based in another country, the time differences can mean this is less convenient.

2. Server overload

Sometimes websites go down because the server on which they are hosted cannot handle the number of processes taking place. One cause is a DDoS attack, where attackers flood a server with so many requests that it goes offline. It can also happen on a shared server when one of the hosted websites receives so much traffic that the other sites suffer performance problems as a result, or when something on your own website goes viral and you suddenly get unexpectedly high volumes of visitors all trying to reach your site at the same time.
If you use shared hosting, make sure your web host puts measures in place to prevent other users' websites monopolising the server's resources. If the amount of traffic you receive is regularly taking your site offline, that is a sign of a very popular website and of the need to upgrade to a larger hosting package that can handle your traffic.


3. Coding errors

A common cause of downtime is coding errors on your website. Whilst individual pieces of software are usually error free, running them together can sometimes cause a conflict. For example, if you run a WordPress website, you may find that two separate plugins are incompatible: each may work perfectly when the other isn't installed, but with both installed your site may go offline. If this happens, you may need to find an alternative plugin.
Another type of coding error happens when people tinker with the code on their site without really knowing what they are doing. Lots of people do this, particularly with CMS website software like WordPress. If you intend to tinker with the code, always make sure you have a backup so that, if the worst happens, you can get your site back online quickly.

4. Hacking

Besides the DDoS attacks we mentioned earlier, there are other forms of hacking that can take your site offline. If a hacker gets access to your cPanel or your server area, they can take your site down by deleting or tampering with files. Alternatively, they can redirect your visitors to other websites, so whilst it might look like your site is online, everyone who tries to visit it ends up on a different and often malicious website. To avoid this, always use strong login passwords and keep your site secure.
5. Poor hosting
If you find that your website goes offline on a regular basis, it could be that you have opted to go with a poor-quality web host. They could be using outdated hardware, cramming too many users on to each server or simply not monitoring how well their servers are performing. If this is the case, you need to migrate your site to a different and more reliable web host.

How to stop your website going offline

1. Find the right host

The single most important factor in keeping your website online is choosing the right host. A good host will provide a wide range of services to keep your uptime as high as possible; these include:

Use of fast, high-performance servers
Configuring servers for optimal website performance
Use of backup servers (or data centres) if a server goes down
Monitoring server performance so issues are dealt with before a server fails
Monitoring for hacking and malware
Managed updates and patching to operating system software

In addition, a good host will also provide services to help you look after your website, such as remote backups, 24/7 technical support and site monitoring. Some hosts will also offer you guaranteed uptime. Depending on the type of hosting package you choose, this can be between 99.95% and 100% uptime, backed up by a service level agreement (SLA).
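
To put those figures in perspective, a quick back-of-envelope calculation shows how much downtime each SLA level actually permits over a year:

```python
# Back-of-envelope calculation: how much downtime per year each SLA level allows.
HOURS_PER_YEAR = 365.25 * 24

for sla in (99.9, 99.95, 99.99, 100.0):
    allowed_hours = (1 - sla / 100) * HOURS_PER_YEAR
    print(f"{sla:6.2f}% uptime -> up to {allowed_hours * 60:7.1f} minutes "
          f"({allowed_hours:5.2f} hours) of downtime per year")
```

At 99.95%, that works out at roughly four and a half hours of potential downtime a year; at 99.99%, it is under an hour.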

2. Keep your software up to date

To prevent your site going offline because of hacking, it is important to keep your site software up to date at all times. Cybercriminals look for vulnerabilities in outdated CMS, theme and plugin software and target the sites where those vulnerabilities are found. The best way to avoid this is to set up automatic updates; that way, you won't need to worry about applying them yourself.
3. Monitor your website
Monitoring services provided by web hosts or through installed plugins can alert you if your site goes offline or if there are issues you need to take action on. They will also create error logs which you can use to diagnose why your site went offline and help you restore service quickly.
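
If you would like a simple check of your own to run alongside whatever your host provides, something along the lines of the sketch below, run from a machine outside your own network (for example on a schedule every few minutes), will flag when the site stops responding. It assumes the third-party requests library and uses a placeholder URL.

```python
# A minimal external uptime check: request the homepage and report anything
# other than a quick, successful response. Assumes the `requests` library
# (pip install requests); the URL and timeout below are placeholder assumptions.
import sys
import requests

SITE_URL = "https://www.example.com/"   # replace with your own site
TIMEOUT_SECONDS = 10


def check_site(url: str) -> bool:
    try:
        response = requests.get(url, timeout=TIMEOUT_SECONDS)
    except requests.RequestException as exc:
        print(f"DOWN: {url} unreachable ({exc})")
        return False
    if response.status_code != 200:
        print(f"WARNING: {url} answered with HTTP {response.status_code}")
        return False
    print(f"OK: {url} responded in {response.elapsed.total_seconds():.2f}s")
    return True


if __name__ == "__main__":
    sys.exit(0 if check_site(SITE_URL) else 1)
```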

What to do if your site goes offline

1. Make sure it is actually offline

If you cannot reach your website, make sure, first of all, that the problem is not with the device you are using, the connection you are on or your browser. If you can, get someone else in a different location to verify the status for you.

2. Find the cause of your downtime

When you attempt to visit your site, you may get clues to the cause. Often the browser will report a reason for a failed connection. If there is a coding error, you may see the error code displayed instead of the actual web page. If there is heavy traffic, the site may take a very long time to load and may only partially show the contents.
If these attempts to find a cause fail, you may need to log in to your cPanel and look at any error logs. If you cannot log in to your cPanel, then it is likely to be an error at your hosting company’s end.

3. Contact technical support

If you have chosen a web host with 24/7 technical support, get in touch with them straight away. Inform them of your problem and ask for assistance. Often, what looks like a major problem to you may be something they can solve quite easily.

4. Let your customers know

If your site goes down, you need to let your customers know about it. If you can still send email, send out a message to subscribers informing them that the site is currently unavailable and that service will resume shortly. You can also do this on social media, for example, you can post a status update on your Facebook page or send out a tweet. You should also do these things once normal service is resumed.
5. Disaster recovery planning
If your website is a critical element of your business, ideally, you should have a disaster recovery plan in place to deal with getting your site back online as quickly as possible. For more information read our post ‘10 Tips for an Effective Disaster Recovery Plan’.

Conclusion
Hopefully, from reading this article, you should have a clearer understanding of why a website can go offline, what you can do to prevent it and what you need to do if your site does go down.
For web hosting with exceptional reliability, including packages with 100% uptime, guaranteed by SLA, visit our homepage to see our wide range of hosting plans.