HOWTO Get Free Official AWS Practice Exam Voucher Codes!

I’ll keep this post really brief, but you could have blown me down with a feather when I found out that it is possible to get completely free and official practice exam voucher codes from Kryterion!

The only proviso is that you need to already have passed at least one exam. It seems that one key benefit of being AWS certified is that they will provide you with a free practice exam voucher code, which can be used towards either an Associate or Professional level practice exam! How great is that?

For me personally, the practice exams are a brilliant (if not the best) way to prepare for the AWS exams. I always use them in the last couple of days as a way to check my progress, build confidence before the exams, and identify / fill in any gaps in my knowledge.

Free Official AWS Practice Exams

So… how do I get these free official AWS practice exam voucher codes then?

It’s very simple! To get the free official practice exam voucher code, follow the steps below!

  1. Log into your certification account.
  2. Click Redeem My Benefits.
  3. Select one of the lines from an existing exam and click Claim Benefit.
  4. It will then provide you with an exam voucher code for free!
  5. Then just go to Schedule and Manage Your Exams.
  6. Click Register for an Exam.
  7. Expand +AWS Practice Exams.
  8. Add your preferred practice exam to your basket.
  9. Enter your code into the Coupon / Voucher Code field.
  10. Click Continue.
  11. You’re done! Now just click Launch!

There we go, easy peasy! Even better, this is repeatable for as many exams as you have done, so each time you pass an exam, you will get another voucher!

Thanks very much AWS, what a great perk!

Does Cloud Provide Infinite Storage Capacity and Retention?


I wrote last week about the challenges of long-term retention of data, and some of the architectural considerations and decisions we take in designing long-term backup or archive solutions. The follow-up question therefore is, does the cloud provide infinite storage capacity and retention?

“Cloud Integration”

One of the key themes I have been seeing of late with many (if not all!) modern storage solutions is some form of cloud integration. It seems to me that many vendors are trying to ensure they can tick the “cloud integration” check box in an RFP or RFI!

I recall that at a previous organisation, our storage team issued an RFP asking for an array which was capable of doing file presentation. The response to the RFP was “Yes”, but when this was dug into a bit further (after the fact), it turned out that this was only possible with an HA pair of custom vendor file gateways. In other words, not much better than building your own file server!

Anyway, back to the point: this “RFP checkbox” mentality means that some vendors have very tight cloud integration with multiple replication target options (such as DC to DC, DC to Cloud, Cloud to DC, Cloud to Cloud, etc), whilst others provide little more than lip service to cloud integration.

The best suggestion I can make in this scenario is to push your vendor for either a demo, a PoC, or a software copy of their array, if they have one. That way you can be absolutely sure that what is claimed is indeed what you are looking for!

One Possible Solution… EMC Unity

One solution which I believe falls more and more into the cloudy camp with each code release is the new EMC Unity array family, on which we were given a briefing at the recent Storage Field Day 13 event.

What I found particularly interesting was that the arrays are natively capable of up to 256 redirect-on-write snapshots per volume. That sounds like a lot, but if you take one every 5 minutes then you will run out pretty fast! By utilising the EMC Cloud Tiering Appliance (a totally separate management interface today, which I really hope EMC fully integrates into Unity pretty quickly, as multiple panes of glass are no fun for anyone!), we can utilise any S3-compatible storage to provide UNLIMITED snapshot retention.

This is pretty cool if you have to provide very granular restoration points for your application data, as well as the ability to ship data off-site at relatively low cost into a near-infinite storage facility!
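
To put that 256-snapshot limit into context, here is a quick back-of-the-envelope sketch (plain Python, nothing vendor-specific) of how long a fixed per-volume snapshot budget lasts at different schedule intervals before older snapshots have to be expired or tiered off to the cloud:

```python
# Rough illustration: how long does a fixed per-volume snapshot budget
# last at a given schedule before old snapshots must be expired?
MAX_SNAPSHOTS = 256  # native per-volume limit mentioned above

for interval_minutes in (5, 15, 60, 240):
    retention_minutes = MAX_SNAPSHOTS * interval_minutes
    print(f"every {interval_minutes:>3} min -> "
          f"~{retention_minutes / 60:.1f} hours "
          f"(~{retention_minutes / 1440:.1f} days) of retention")

# every   5 min -> ~21.3 hours (~0.9 days) of retention
# every  15 min -> ~64.0 hours (~2.7 days) of retention
# every  60 min -> ~256.0 hours (~10.7 days) of retention
# every 240 min -> ~1024.0 hours (~42.7 days) of retention
```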

Sadly, you can’t currently run VMs directly from those snapshots in the cloud, but bearing in mind that EMC already has a software-only version of Unity available, I have a sneaking suspicion that there will be some engineering talent working on this as we speak! This would potentially provide the ability to snap and replicate your entire estate natively to S3 buckets in the cloud, then restore very quickly within that IaaS platform. Let’s hope I’m right!

Want to Know More?

EMC’s sessions on Unity, Scale-IO and Isilon were recorded and are now available to stream online:

Some of the other SFD13 delegates had their own thoughts on the session and EMC in general. You can find them here:

Disclaimer/Disclosure: My flights, accommodation, meals, etc, at Storage Field Day 13 were provided by Tech Field Day / Gestalt IT, but there was no expectation or request for me to write about any of the vendors’ products or services, and I was not compensated in any way for my time at the event.

Long Term Data Retention – What do I do?

One of the more common requirements I come across on a day-to-day basis, working with organisations across a broad spectrum of industries, is the question of how to manage long-term data retention.

Frankly, I have massively oversimplified the question, as there are many more nuances to it than this! Some of the questions, discussion points and potential solutions I see when trying to scope out and define a long-term data retention strategy are below. We assume in this case that we are talking about backing up application data, but the same can apply to file data, such as that from a file server.

Long Term Data Retention – Questions, questions, questions?!

Like beautiful snowflakes, ultimately it always comes back to gathering the requirements for the individual business.

What are the regulatory and compliance requirements for long-term retention of data, and what are the consequences of losing that data? In the new world, this could be pretty serious, especially with things like GDPR right around the corner. Escalating this up the business hierarchy can secure buy-in from other parts of the business and additional budget outside of IT, for a solution which meets the actual requirements rather than a botch job which will likely fail when put to the test.

How long is the actual data retention required? If we are relying on being able to read data back in 7 years’ time, the current or a future version of the backup software may still work, but will we still have the kit to read the tapes or other media? If using spinning rust as the storage medium, do we expect to be able to migrate data from one disk system to another easily in future, and if so, how does that impact things like encryption, capacity, deduplication and compression of that data?

What is it that we are trying to protect against? Deliberate or accidental deletion, total destruction of a server, array or DC, or perhaps we simply need to be able to prove what the data looked like at a specific date / time.

How granular does the data need to be? For example, do we need to be able to pull a file version from a specific week at any point in the past X years? The more granular we need to get, the more expensive the solution potentially becomes. If we have controls in place to protect archive data against accidental / deliberate deletion, then we may not actually need to keep more than a few days or weeks of backups (as an example).

The use of FIM (File Integrity Monitoring) tooling can be very helpful in this regard, especially for flat file structures. These tools track all changes to your file system, and if something is removed or updated, you can alert your server teams to investigate why and restore the data from a recent backup.
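
As a rough illustration of the FIM concept (not any specific product), a minimal baseline-and-compare pass might look something like the sketch below; the path and the alerting action are placeholders for this example:

```python
import hashlib
import os

def snapshot_tree(root):
    """Build a baseline of path -> SHA-256 hash for every file under root."""
    baseline = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                baseline[path] = hashlib.sha256(f.read()).hexdigest()
    return baseline

def diff_trees(old, new):
    """Return files removed, added or modified since the baseline was taken."""
    removed = sorted(set(old) - set(new))
    added = sorted(set(new) - set(old))
    modified = sorted(p for p in old.keys() & new.keys() if old[p] != new[p])
    return removed, added, modified

# Take a baseline, then re-scan later and flag anything that has changed.
# "/data/archive" is just a placeholder path for this illustration.
baseline = snapshot_tree("/data/archive")
# ... time passes ...
removed, added, modified = diff_trees(baseline, snapshot_tree("/data/archive"))
for path in removed + modified:
    print(f"ALERT: investigate and restore from backup: {path}")
```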

Can the application or server prevent deliberate or accidental data deletion? If the application data can be treated as, or written to, WORM (Write Once Read Many) storage, then the risk of data loss is further reduced, especially if that storage can be replicated off site. This doesn’t really help much with things like SQL databases, however!
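
As one illustration of the WORM concept (my own example, not something from the original discussion), object storage retention features such as Amazon S3 Object Lock can make data immutable for a defined period. A minimal boto3 sketch, assuming suitable credentials, a default region and a placeholder bucket name:

```python
from datetime import datetime, timedelta, timezone

import boto3

s3 = boto3.client("s3")

# Object Lock must be enabled when the bucket is created (placeholder name).
s3.create_bucket(Bucket="example-worm-archive", ObjectLockEnabledForBucket=True)

# In COMPLIANCE mode the object cannot be overwritten or deleted until the
# retain-until date passes, not even by an administrator.
with open("app-archive.csv", "rb") as body:
    s3.put_object(
        Bucket="example-worm-archive",
        Key="exports/app-archive.csv",
        Body=body,
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=7 * 365),
    )
```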

Where is the archive data for the application or solution actually held? Is it within the live system (e.g. the live DB), or can it be exported onto a tertiary archive system where it becomes read-only to all parties, including administrators? Even better, can the application export the data into a generic format which is more likely to be readable in 25+ years’ time (such as CSV, plain text, etc.)? This provides quite a bit more flexibility in terms of future access and recovery options.

Does the application or server provide RBAC (Role-Based Access Control), and has it actually been implemented yet? If we minimise the number of people who could update or delete data (maliciously or accidentally), we minimise the risk of data loss.

What is the budget for the solution? All singing, all dancing, physical or software solutions can be great, but you may not be able to afford them.

Are we looking for an appliance-based solution which includes storage, replication, backup plugins, etc, or do we already have the hardware and just need some software? This often, but not always, comes down to a time vs budget question. Do you want to spend your team’s time managing clunky backup software, or just buy a policy-based appliance which does half the work for you?

What are your sovereignty requirements for the data, and would a cloud-based service be appropriate for your business? It can be very cheap to store data in something like S3 or blob storage, if the business accepts this and you don’t need to pull any of the data back very often (if at all).
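
As a rough sketch of the “cheap cloud tier” idea, the snippet below (my own illustration, with a placeholder bucket name) uses an S3 lifecycle rule to push archive data down to a colder, cheaper storage class automatically and expire it once the retention period has passed:

```python
import boto3

s3 = boto3.client("s3")

# Move anything under the archive/ prefix to Glacier after 30 days, and
# expire it once the (example) 7-year retention period has passed.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-lt-retention",  # placeholder bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-to-glacier",
                "Filter": {"Prefix": "archive/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 7 * 365},
            }
        ]
    },
)
```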

How quickly is the data required when requested, how large is a typical access request, and how often are requests made? If the answer can be hours or days, then an offline or cloud solution may be appropriate, but anything where immediate access is required is a different story.

Similarly, will we want to restore or access this data in the event of a DR, and does this solution form part of our DR strategy? Perhaps it’s only required for access to much older data, because we are replicating the most recent data to a DR facility!

As we can see, there are many, many, [many!] things to think about when considering long-term retention of data in a backup or archive solution.

What brought this up Alex?…

… I hear you ask!

I recently attended Storage Field Day 13, where we had a presentation from a backup vendor, StorageCraft, who has been in the SMB and mid-market space for many years, and it got me thinking!

The latest iteration of their backup software provides a local cache with cloud integration, and the added ability to spin up a DR environment in the event of an outage to your primary DC. A pretty nifty feature if you are legally able to store your data outside of your local environment (they currently have DCs in the US and EU only).

They can also create backups using their proprietary SPF file format, which has apparently not changed since its inception around 15 years ago. There is also no concept of a media server, as each server manages its own backups (albeit with the option of a central scheduling tool). This gets around the issue of backup compatibility, though it may limit their ability to provide additional data services for the backup files, such as encryption, dedupe or compression, beyond those of the storage targets they reside on.

This is what tickled my mental matrix into deploying my keyboard! 🙂

Want to Know More?

The session was recorded and is now available to stream online:

StorageCraft Presents at Storage Field Day 13

Some of the other SFD13 delegates had their own thoughts on the session and StorageCraft in general. You can find them here:

Dan Frith – StorageCraft Are In Your Data Centre And In The Cloud

Scott Lowe – Backup and Recovery in the Cloud: Simplification is Actually Really Hard

Disclaimer/Disclosure: My flights, accommodation, meals, etc, at Storage Field Day 13 were provided by Tech Field Day / Gestalt IT, but there was no expectation or request for me to write about any of the vendors’ products or services, and I was not compensated in any way for my time at the event.

vBlog 2017 – Top Virtualisation & Storage Blogs

I’ll keep this post about vBlog 2017 very brief as you can see my thoughts on the subject of soliciting votes for awards in my post from 2015!

It’s that time of year again when Eric Siebert of vSphere Land and vLaunchpad runs his annual Top 100 vBlog nominations!

There are a huge number of bloggers around the world producing great documentation and insight, as well as podcasters helping you pass your daily commute in a constructive and educational fashion! Eric’s awards give people the opportunity to recognise those who really stand out from the crowd, as well as up-and-coming bloggers / podcasts.

vBlog 2017 sounds great! How do I vote?

I would encourage you to head over to Eric’s site and cast your votes; it only takes a few seconds of your time to show some appreciation for the time and effort put in by those ladies and gentlemen who work tirelessly throughout the year to help make all of our jobs that little bit easier.

Of course, if you do feel like throwing a vote for the Open TechCast podcast and / or Tekhead.it, then it would of course be much appreciated! 😀

Direct link to the voting is also here:
http://topvblog2017.questionpro.com/

