WordPress Backups for Beginners and Advanced Users

We all know that backing up data is important. Whether it's a corporate Windows file server or our treasured family photos, we make sure we can recover our data in the event of a hardware failure. Oddly enough, most folks tend to skip over their website data when planning their backup strategy.

Although WordPress is the most used CMS in the world, many users still struggle to find a good backup solution. Thankfully, with a combination of Cloud-A's Bulk Storage and the popular UpdraftPlus WordPress Backups plugin, automatically backing up and restoring your website is extremely easy and cost-effective. This makes it an ideal solution for WordPress users of any skill level.
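For readers who want to understand what a backup actually involves before handing the job to a plugin, here is a minimal manual sketch of the two pieces UpdraftPlus automates (site files plus database). All paths and credential names below are assumptions for illustration, not values from your install:

```shell
# Minimal manual WordPress backup sketch (UpdraftPlus automates this).
WP_ROOT="${WP_ROOT:-/var/www/wordpress}"   # assumed install path
BACKUP_DIR="${BACKUP_DIR:-/tmp/wp-backups}"
STAMP=$(date +%Y%m%d)
mkdir -p "$BACKUP_DIR"

# Demo fallback so the sketch runs anywhere; a real site skips this.
[ -d "$WP_ROOT" ] || { WP_ROOT=$(mktemp -d); echo demo > "$WP_ROOT/index.php"; }

# 1. Archive the site files (wp-content holds themes, plugins, uploads).
tar -czf "$BACKUP_DIR/wp-files-$STAMP.tar.gz" -C "$WP_ROOT" .

# 2. Dump the database (uncomment and fill in real credentials):
# mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" > "$BACKUP_DIR/wp-db-$STAMP.sql"

echo "wrote $BACKUP_DIR/wp-files-$STAMP.tar.gz"
```

A plugin like UpdraftPlus adds the pieces this sketch leaves out: scheduling, retention, and shipping the archives off-server to object storage such as Cloud-A's Bulk Storage.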

Read more

Cloud-A @ OpenStack Summit – A Week in Review


As some of our followers may know, the Cloud-A team spent last week in Vancouver for the OpenStack Summit. It was our first summit, and it was nice to have it in our home country. The summit gave us the opportunity to interact first-hand with our community, hear keynotes about the direction of our technology, and get input from industry leaders and vendors about new and emerging technology in the OpenStack ecosystem. All in all, there was some very familiar messaging about how OpenStack has grown past infancy and is enterprise-ready – something we at Cloud-A have never doubted. To be perfectly honest, we don't feel much of what was announced was groundbreaking, but we still wanted to highlight some of the themes and news from last week's OpenStack Summit in terms of future value for our customers.

Read more

Disaster Recovery For Those Concerned About DR In The Cloud

Although some would say the cloud is redundant to the point that you don't need a Disaster Recovery Plan (DRP) for it, many of our clients are more comfortable because we offer them simple, easy ways to protect themselves in this regard. Our approach is relatively unique in that we design to avoid client data lock-in, so our clients' data is always easily accessible to them.

Almost all of our customers at Cloud-A need to manage backup, archival and disaster recovery data in a variety of ways. This speaks to the diverse nature of our customers, who span a wide array of industries including healthcare, financial services, higher education and the government sector. Each customer has their own internal requirements for data protection, in addition to the requirements of their clients and relevant governing bodies. Some customers combine file-level backup solutions with periodic snapshots so that they have point-in-time instances of their systems to recover to. Here are some of the more popular backup solutions we've seen deployed on Cloud-A so far:

Read more

Time to Spring Clean your SAN – Bulk Storage for Data Archival

It is no secret that the amount of data everywhere is growing. Terms like IoT, Big Data and M2M have a certain hype cycle around them, but at the end of the day they are all very relevant concepts. Storage requirements are growing at over 50% per year, and IDC and EMC predict that the "digital universe" will amount to over 40,000 EB, or 5,200 GB per person on the planet, by 2020 – the majority of it unstructured data.
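To put that 50% annual growth rate in perspective, here is a quick back-of-envelope calculation (the 10 TB starting pool and five-year horizon are illustrative numbers, not figures from the IDC/EMC report):

```shell
# Compound a hypothetical 10 TB storage pool at 50% growth per year.
awk 'BEGIN {
    tb = 10
    for (y = 1; y <= 5; y++) {
        tb *= 1.5
        printf "year %d: %.1f TB\n", y, tb
    }
}'
```

At that rate the pool grows more than sevenfold in five years, ending near 76 TB, which is why flat-cost primary storage budgets fall behind so quickly.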

Data Management and Internal Cost per GB

One of the biggest struggles for organizations in the coming years will be managing these large, growing pools of data. Most enterprises use Storage Area Network (SAN) technology for their primary internal storage, leveraging very fast storage media (SSD, flash) and Fibre Channel networking to deliver lightning-fast IOPS and redundancy. The problem with SAN technology is that it is expensive. Every organization's cost per gigabyte of storage will differ, since it depends on many factors, including:

- The type of SAN (FC/iSCSI)
- The manufacturer and model
- Capacity licensing
- Support requirements
- The type, capacity and speed of disks supported
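As a rough illustration of how those factors combine into an internal cost per gigabyte, here is a back-of-envelope sketch. Every figure below is hypothetical; substitute the numbers from your own SAN quote:

```shell
# Back-of-envelope SAN cost-per-GB estimate (all figures hypothetical).
HARDWARE=120000    # array, shelves, FC switching (CAD)
LICENSING=25000    # capacity licensing (CAD)
SUPPORT=15000      # multi-year support contract (CAD)
USABLE_GB=40000    # usable capacity after RAID overhead

awk -v total=$((HARDWARE + LICENSING + SUPPORT)) -v gb="$USABLE_GB" \
    'BEGIN { printf "%.2f CAD per GB\n", total / gb }'
```

With these sample numbers the SAN works out to 4.00 CAD per usable gigabyte, which is the figure worth comparing against per-GB bulk object storage for archival data.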
Read more

Continuous Integration and Cloud-A

con·tin·u·ous in·te·gra·tion

/kənˈtinyo͞oəs ˌin(t)əˈɡrāSH(ə)n/

Continuous Integration (CI) is a development practice that requires developers to integrate code into a shared repository several times a day. Each check-in is then verified by an automated build, allowing teams to detect problems early.

Business Benefits of CI

CI Reducing Risk

When you integrate code frequently, you reduce the risk level of any project. It is essential to have metrics in place to measure the health of the application so that defects in code can be detected sooner and fixed more quickly. Frequent integration also keeps the gap between the application's current state and the state of the application in development small.
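As a toy illustration of the "detect problems early" idea, here is a sketch of a pre-push Git hook that refuses to push code whose tests fail. TEST_CMD is an assumed entry point (it defaults to `true` so the sketch runs anywhere; a real project would point it at something like `make test`, and a CI server would run the same command on every check-in):

```shell
#!/bin/sh
# Toy pre-push hook sketch: block the push when the test suite fails.
# Install as .git/hooks/pre-push and make it executable.
TEST_CMD="${TEST_CMD:-true}"   # placeholder; substitute your real runner
if ! $TEST_CMD; then
    echo "tests failed; push aborted" >&2
    exit 1
fi
echo "tests passed"
```

A hook like this is only the client-side half of CI; the shared build server is what guarantees every integration is verified regardless of individual developer setups.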
Read more

CANARIE DAIR Program is helping Canadian businesses adopt next-generation cloud technology

It's great to see so much positive change happening in the Canadian landscape with respect to cloud education and adoption. We believe that is due in no small part to the hard work and innovative vision that organizations like CANARIE have for the future of high-tech networking and infrastructure in Canada. The DAIR program, launched in 2013 (http://www.canarie.ca/cloud/), is a great example of that: it provides Canadian entrepreneurs and small businesses with free cloud-based compute and storage resources that speed up time to market by enabling rapid, scalable product design, prototyping, validation and demonstration. Since its inception, the DAIR program has enabled many new Canadian startups to benefit from the scale, speed and agility of cloud technologies to transform their business processes and get to market faster. Read more

Cloud-A at the Atlantic Security Conference

This week the Cloud-A team will be at the 5th annual Atlantic Security Conference in Halifax, Nova Scotia. The Atlantic Security Conference was recently listed as one of The Top 50 Must-Attend Information Security Conferences in the world by Digital Guardian, and we are looking forward to interacting with Atlantic Canada’s infosec community, representing the OpenStack community and promoting the use of true cloud computing.

This year Cloud-A is offering a free one hour DevOps consulting session for any organization who attends AtlSecCon and signs up for a Cloud-A account.

Stop by our table and say Hi! For anyone unable to make it, check our twitter account (@CDNCloudA) for live AtlSecCon updates.

How Canadian Government & Crown Corps are using Cloud-A

We've seen a fair amount of growth in the government sector recently, and as a result we thought it would be a good idea to describe how our clients are leveraging the first public cloud founded in Canada by Canadians. We've seen some great steps recently from the Canadian government to facilitate Shared Services standards that envision the inclusion and use of public cloud providers like Cloud-A in government infrastructure. Recent Canadian government documents like the "IT Shared Services Security Domain & Zones Architecture" specify standards and best-practice guidelines so that, in the future, shared IT services can be transposed to similar shared services offered through a public cloud provider under contract to the GC.
Read more

Migrating from the Heroku PaaS to Cloud 66 & Cloud-A

cloud66

We regularly come across clients who built their applications on Heroku but whose customers require their data to reside in Canada. This was one of the drivers for our relationship with Cloud 66. Here is a guide for migrating from Heroku to Cloud 66 + Cloud-A.

About migrating from Heroku

Migrating your application from Heroku to Cloud 66 involves deploying your code, importing your data and redirecting your traffic to the new endpoint.

What server size do I need?

Using Heroku, you can choose between 1X (512 MB), 2X (1 GB) and PX (6 GB) server sizes. This makes it easy to calculate your server requirements, and we recommend that you use similar server resources when deploying your stack with Cloud 66. We also recommend that you have a separate server for your database in production environments.


Migrating

1. Code

Simply provide Cloud 66 with the URL to your Git repository so that it can be analyzed. For more information, see Accessing your Git repository.

2. Data

Once your code is deployed, it’s time to migrate your data across. The process differs for PostgreSQL and MySQL databases:

PostgreSQL

From your Heroku toolbelt, create a database backup URL by running heroku pgbackups:url. Next, visit your stack detail page and click the Import Heroku data link. Paste the URL provided by the toolbelt into the field, and click Import Heroku data.

MySQL

Start by dumping your existing database. Refer to the ClearDB documentation for common problems.

$ mysqldump -u [username] -p[password] [dbname] > backup.sql

Once you have a MySQL dump file, use the Cloud 66 toolbelt to upload the file to your stack database server. Remember to replace the fields below with your values.

$ cx upload -s "[stack_name]" [database_server_name] backup.sql /tmp/backup.sql

Next, use the toolbelt to SSH to your server.

$ cx ssh -s "[stack_name]" [server_first_name]

Finally, use the command below to import your backup into the database. You can find the generated username, password and database name by visiting your stack detail page and clicking into your database server (e.g. MySQL server). Note there is no space after -p, and the file path matches the upload destination above.

$ mysql -u [generated_user_name] -p[generated_password] [database_name] < /tmp/backup.sql

3. Traffic

Once you’re ready to serve traffic from your Cloud 66 stack, you need to redirect your traffic to it. For more information, see Configure your DNS.

Useful pointers

Web server and Procfile

By default, Cloud 66 will deploy your stack with Phusion Passenger, but you can also choose a custom web server like Unicorn. You may have a web entry in your Procfile to do this on Heroku. Cloud 66 ignores this entry to avoid compatibility issues.

To run a custom web server, we require a custom_web entry. It is important to set this before analyzing your stack, to avoid building the stack with Passenger.

You can also use the Procfile to define other background jobs.

Dyno recycling

Heroku restarts all dynos after 24 hours of uptime, which may conceal memory leaks in your application. When you migrate to Cloud 66 these leaks become noticeable, because we don't restart your workers (other than during a deployment), so a leak has time to grow. A temporary solution is to re-create the Heroku restart behavior, for example with this script:

for OUTPUT in $(pgrep -f sidekiq); do kill -TERM $OUTPUT; done

This will send a TERM signal to any Sidekiq workers, giving them 10 seconds (by default) to finish gracefully. Any workers that don’t finish within this time period are forcefully terminated and their messages are sent back to Redis for future processing. You can customize this script to fit your needs, and add it to your stack as a shell add-in.

Note that this is a temporary solution; we recommend using a server-monitoring solution to identify the source of the leak.
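If you do want to mimic Heroku's 24-hour restart cycle while you investigate, the script above can be scheduled with cron. This is a hypothetical crontab entry, not a Cloud 66 feature; adjust the time and process pattern to suit your stack:

```shell
# Hypothetical crontab entry: restart Sidekiq workers nightly at 03:00,
# mimicking Heroku's dyno cycling until the leak is found and fixed.
0 3 * * * for OUTPUT in $(pgrep -f sidekiq); do kill -TERM $OUTPUT; done
```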

Asset Pipeline Compilation

If you haven’t compiled assets locally, Heroku will attempt to run the assets:precompile task during slug compilation. Cloud 66 allows you to specify whether or not to run this during deployment.

Making your App Canadian Data Resident

Probably the simplest step in the whole process is ensuring that your application is Canadian data resident. Create an account with Cloud-A, and when you set up your stack on Cloud 66, select Cloud-A as your cloud provider of choice.

Ready to give it a try?

Get started with a $10 credit

Create my account!

The Great Canadian Cloud Migration


$1 USD = $1.30 CAD = 30% more Cloud-A Infrastructure

Several months ago we announced $50 in free Cloud-A account credit for anyone migrating to Cloud-A from an American cloud provider. The promotion was a HUGE SUCCESS and we are bringing it back!

Today $1 USD = $1.30 CAD, which means that by choosing Cloud-A as your cloud infrastructure provider, you are receiving a 30% discount compared to American providers, not including the free bandwidth, virtual private cloud and Windows licensing you get with Cloud-A.

We have had more users convert over from American cloud providers this month than we could have ever imagined, and these users are enjoying over 50% cost savings on average as a result (not including the free credit).

TIP: When shopping around, keep an eye out for Canadian “cloud” providers who charge for their IaaS in USD. They exist.