Microsoft Azure Architect 70-534 Exam Experience and Tips

The information below covers my Microsoft Azure Architect 70-534 Exam experience. Following this I will post a list of my study materials, so keep checking back for updates!

One real positive for me when taking this exam was realising that if you already hold an MCSA 2012, you do not need to take another Azure exam to achieve the MCSE title. Handy, especially as I have been pretty vocal about my thoughts on re-certification for versioned exams!

Microsoft Azure Architect 70-534 Exam Experience

Almost everything I read in the run-up to taking the Azure Architect 70-534 exam suggested that it was going to be pretty tricky. Many people suggested to me it was harder than typical MS exams; for those of us who are already a bit cloudy, harder than the AWS SA Associate exam but easier than the SA Pro.

My personal experience (having done both) was that it was a little harder than the AWS SA Pro exam, mainly in prep time and breadth of information, but the reputation was perhaps a wee bit overblown. Don’t get me wrong, it was definitely tricky, but I have a sneaking suspicion that they may have dumbed it down a little in the past few months, as my experience did not quite match that of those who came before me!

The scoring methodology was WAY better than many other exams I have taken in the past (including from Microsoft). When you have a multi-part answer (e.g. choose 3 of 5), you get a point for each correct PART. In other exams, one wrong selection means “nil points”! In the 70-534 exam, I could have got one wrong selection in every multi-part answer and still walked away with half or more of the points, which is AWESOME! This really took the pressure off!

The exam is formatted very similarly to most other MS exams, with a couple of notable exceptions. There is a section with the standard multi-part, ordering, drag/drop and multi-choice questions, as you would expect. Once this is completed (or perhaps before?), you do a number of case studies. Note: once you complete each case study, you cannot go back to it; however, the timing for the case studies is cumulative, so you don’t have to worry if one takes you a bit longer than another.

The number of questions in my exam left me with plenty of time, compared with colleagues who have taken it both before and since, some of whom had 50% more questions and case studies than I did (I had 39 questions spread across all sections of the exam). I can only suggest that there have been some changes of late, which means you may or may not end up with more time per question.

It’s also worth noting that one or two of the questions I received were based on ASM (i.e. classic) instead of ARM! Not enough that it would be worth learning ASM, but don’t be surprised if something does come up.

Exam Tips and Advice

Here are a few tried and tested tips for most exams as well as specific to the 70-534 exam (based on my experience):

  • Flip through the questions for each case study as you get to it, to get an idea of the kinds of topics being asked about (e.g. security, authentication, networking) so that you can bear these in mind as you read the case study.
  • Don’t worry too much about the clock. They give you plenty of time, especially as there is no specific time limit on the individual case studies (I think there may have been in the past?). For the number of questions you are likely to get, this is loads of time.
  • Personal opinion: old questions are dead to me! What I mean is that I don’t mark questions for review, and once I click Next I never, ever, ever, ever, [ever!] go back. Chances are, if I wasn’t sure about an answer, going with my gut is more likely to be right. If I sit there paralysed with indecision, I just waste time (or worse, potentially change a correct answer to an incorrect one!). By the time I hit the end of an exam I generally have a feeling whether I have passed or not, so going back to get a couple of extra points is a waste of time and I am just desperate to see the result! 🙂
    The one and only contradiction to this rule is if I come across a later question which immediately triggers me remembering something, or even blatantly answers a previous question by asking another. These are as rare as hen’s teeth though!
  • Finally, this may sound a bit cryptic, but I can’t go into any detail obviously due to NDA. All I can say is don’t get weirded out by what seems like an odd handful of questions at the start of the 70-534 exam. I got some which didn’t make sense to me at all until the end of the series (which doesn’t allow you to go back). I can’t go into more detail than that, but hopefully this preps you more than me, so you are not as surprised!

Architect Grumpiness

I do have one complaint about this exam, which I will therapeutically air publicly now: why on earth, for an “Architect” exam, should anyone have to memorise the thousands of possible combinations of PowerShell commands, or indeed any commands whatsoever?! Fortunately, the percentage of the exam weighted towards this is small, but it is ridiculous IMO. 532/533, yes! 534? Stupid!

There also seems to be a key focus on understanding the exact specs of specific machine types. IMO this is also dumb, as with any cloud platform you simply pull up the machine list and match the right machine at the time (see the sketch below). Wasting time memorising the spec of every A-series, D-series, etc. machine is completely pointless, but it is unfortunately required reading (at least as a minimum to remember the key “odd” ones, such as which provide RDMA).
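
To labour the point, the machine list is exactly the sort of thing the platform will give you on demand. Below is a minimal sketch (my own illustration, nothing to do with the exam content) using the Python azure-identity and azure-mgmt-compute SDKs to list the VM sizes available in a region; the region name and the AZURE_SUBSCRIPTION_ID environment variable are assumptions for the example.

```python
# Minimal sketch: query the available VM sizes at runtime rather than
# memorising them. Assumes the azure-identity and azure-mgmt-compute
# packages are installed and AZURE_SUBSCRIPTION_ID is set for an account
# you can authenticate against.
import os

from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

subscription_id = os.environ["AZURE_SUBSCRIPTION_ID"]
compute = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

# List every size in a region with its core count and memory, so you can
# "match the right machine at the time" instead of recalling it from memory.
for size in compute.virtual_machine_sizes.list(location="westeurope"):
    print(f"{size.name}: {size.number_of_cores} cores, {size.memory_in_mb} MB RAM")
```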

Anyway, all in all, a reasonably fair exam across a broad and relatively deep set of information and services. Best of luck to you, and if you found this article useful please leave a comment below! 🙂

Want to Learn More?

Part 2 of this article, covering my 70-534 exam study guide and all of my 70-534 study materials, is available here:

Microsoft Azure Architect 70-534 Exam Experience and Tips

HOWTO Get Free Official AWS Practice Exam Voucher Codes!

I’ll keep this post really brief, but you could have blown me down with a feather when I found out that it is possible to get completely free and official practice exam voucher codes from Kryterion!

The only proviso is that you need to have already passed at least one exam. It seems that one key benefit of being AWS certified is that they will provide you with a free practice exam voucher code for the next exam you do, which can be used towards either Associate or Professional level practice exams! How great is that?

For me personally, the practice exams are a brilliant (if not the best) way to prepare for the AWS exams. I always use them in the last couple of days as a way to check my progress, build confidence before the exams, and weed out / fill in any gaps in my knowledge.

Free Official AWS Practice Exams

So… how do I get these free official AWS practice exam voucher codes then?

It’s very simple! To get the free official practice exam voucher code, follow the steps below!

  1. Log into your certification account.
  2. Click Redeem My Benefits.
  3. Select one of the lines from an existing exam and click Claim Benefit.
  4. It will then provide you with an exam voucher code for free!
  5. Then just go to Schedule and Manage Your Exams.
  6. Click Register for an Exam.
  7. Expand +AWS Practice Exams.
  8. Add your preferred practice exam to your basket.
  9. Enter your code into the Coupon / Voucher Code field.
  10. Click Continue.
  11. You’re done! Now just click Launch!

There we go, easy peasy! Even better, this is repeatable for as many exams as you have done, so each time you pass an exam, you will get another voucher!

Thanks very much AWS, what a great perk!

Does Cloud Provide Infinite Storage Capacity and Retention?

I wrote last week about the challenges of long-term retention of data, and some of the architectural considerations and decisions we take in designing long-term backup or archive solutions. The follow-up question therefore is, does the cloud provide infinite storage capacity and retention?

“Cloud Integration”

One of the key themes which I have been seeing of late with many (if not all!) modern storage solutions is some form of cloud integration. It seems to me that many vendors are trying to ensure they can tick the “cloud integration” check box in an RFP or RFI!

I recall one time at a previous organisation, our storage team issued an RFP asking for an array which was capable of doing file presentation. The response to the RFP was “Yes”, but when this was dug into a bit further (after the fact), it turned out that this was only possible with an HA pair of custom vendor file gateways. In other words, not much better than building your own file server!

Anyway, back to the point: this “RFP checkbox” mentality means that some vendors have very tight cloud integration with multiple target replication options (such as DC to DC, DC to Cloud, Cloud to DC, Cloud to Cloud, etc), whilst others provide little more than lip service to cloud integration.

The best suggestion I can make in this scenario is to push your vendor for either a demo, a PoC, or a software copy of their array, if they have one. That way you can be absolutely sure that what is claimed is indeed what you are looking for!

One Possible Solution… EMC Unity

One solution which I believe falls more and more into the cloudy camp with each code release is the new EMC Unity array, for which we were given a briefing at the recent Storage Field Day 13 event.

What I found particularly interesting was that the arrays are natively capable of up to 256 redirect-on-write snapshots per volume. That sounds like a lot, but if you take one every 5 minutes you will run out in under a day (256 × 5 minutes is a little over 21 hours)! By utilising the EMC Cloud Tiering Appliance (a totally separate management interface today, which I really hope EMC fully integrates into Unity pretty quickly, as multiple panes of glass are no fun for anyone!), we can utilise any S3-compatible storage to provide UNLIMITED snapshots.

This is pretty cool if you have to provide very granular restoration points for your application data, as well as the ability to ship data off-site at relatively low cost to a near-infinite storage facility!

Sadly, you can’t currently run VMs directly from those snapshots in the cloud, but bearing in mind that EMC already has a software-only version of Unity available, I have a sneaking suspicion that there will be some engineering talent working on this as we speak! This would potentially provide the ability to snap and replicate your entire estate natively to S3 buckets in the cloud, then restore very quickly locally within that IaaS platform. Let’s hope I’m right!

Want to Know More?

EMC’s sessions on Unity, Scale-IO and Isilon were recorded and are now available to stream online:

Some of the other SFD13 delegates had their own thoughts on the session and EMC in general. You can find them here:

Disclaimer/Disclosure: My flights, accommodation, meals, etc, at Storage Field Day 13 were provided by Tech Field Day / Gestalt IT, but there was no expectation or request for me to write about any of the vendors’ products or services, and I was not compensated in any way for my time at the event.

Long Term Data Retention – What do I do?

One of the more common requirements I come across on a day-to-day basis, working with organisations across a broad spectrum of industries, is the question of how to manage long-term data retention.

Frankly, I have massively oversimplified the question, as there are many more nuances to it than this! Some of the questions, discussion points and potential solutions I see when trying to scope out and define a long-term data retention strategy are below. We assume in this case that we are talking about backing up application data, but the same can apply to file data, such as from a file server.

Long Term Data Retention – Questions, questions, questions?!

Like beautiful snowflakes, every business is unique, so ultimately it always comes back to gathering the requirements for the individual business.

What are the regulatory and compliance requirements for long-term retention of data, and what are the consequences of losing that data? In the new world, this could be pretty serious, especially with things like GDPR right around the corner. Escalating this up the business hierarchy can get buy-in from other parts of the business to provide additional budget outside of IT, for a solution that meets the actual requirements, not just a botch job which will likely fail when put to the test.

How long is the data actually required to be retained? Looking at most current applications, if we are relying on being able to read back data in 7 years, current or future backup software may still work, but will we have the kit to read the tapes or data? If using spinning rust as the storage medium, do we expect to be able to migrate data from one disk system to another easily in future, and if so, how does that impact things like encryption, capacity, deduplication and compression of that data?

What is it that we are trying to protect against? Deliberate or accidental deletion; total destruction of a server, array or DC; or perhaps we just need to be able to prove what our data looked like at a specific date / time.

How granular does the data need to be? For example, do we need to be able to pull a file version from a specific week in the past X years? The more granular we need to get, the more expensive the solution potentially becomes. If we have controls in place to protect archive data against accidental / deliberate deletion, then we may not actually need to keep more than a few days or weeks of backups (as an example).

The use of FIM (File Integrity Monitoring) tooling can be very helpful in this regard, especially for flat file structures. It can track all changes to your file system, and if something is removed or updated, it can alert your server teams to investigate why and restore it from a recent backup.
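
As an illustration of the underlying idea (not any specific product’s behaviour), here is a minimal Python sketch: hash every file under a directory, compare against a stored baseline, and report anything added, removed or changed. The paths and baseline filename are placeholders for the example.

```python
# Minimal file-integrity sketch: compare a directory tree against a stored
# baseline of SHA-256 hashes and report additions, removals and changes.
# The monitored path and baseline filename are placeholders for illustration.
import hashlib
import json
from pathlib import Path

ROOT = Path("/srv/archive")          # directory to monitor (placeholder)
BASELINE = Path("baseline.json")     # where the known-good hashes live

def hash_tree(root):
    """Return {relative path: sha256 hex digest} for every file under root."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

current = hash_tree(ROOT)
if BASELINE.exists():
    baseline = json.loads(BASELINE.read_text())
    added = current.keys() - baseline.keys()
    removed = baseline.keys() - current.keys()
    changed = {k for k in current.keys() & baseline.keys() if current[k] != baseline[k]}
    for label, paths in (("ADDED", added), ("REMOVED", removed), ("CHANGED", changed)):
        for path in sorted(paths):
            print(f"{label}: {path}")   # in real life, alert the server team instead
else:
    BASELINE.write_text(json.dumps(current, indent=2))  # first run: record the baseline
```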

Can the application or server prevent deliberate or accidental data deletion? If the application can treat its storage as, or write to, WORM (Write Once, Read Many) storage, then the risk of data loss is further reduced, especially if that storage can be replicated off-site. This doesn’t really help much with things like SQL databases, however!

Where is the archive data for the application or solution actually held? Is it within the live system (e.g. the live DB), or can it be exported onto a tertiary archive system where it becomes read-only to all parties, including administrators? Even better, can the application export the data into a generic format more likely to be readable in 25+ years’ time (such as CSV or plain text)? This provides quite a bit more flexibility in terms of future access and recovery options.
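
As a trivial illustration of what “export to a generic format” might look like for a database-backed application, here is a minimal Python sketch dumping a table to CSV; the database file, table and column names are purely hypothetical examples.

```python
# Minimal sketch: export an application table to plain CSV so the archive
# remains readable long after the original application is gone.
# The database file, table and column names are hypothetical examples.
import csv
import sqlite3

with sqlite3.connect("app_data.db") as conn, \
        open("orders_archive_2017.csv", "w", newline="") as out:
    writer = csv.writer(out)
    cursor = conn.execute("SELECT order_id, customer, order_date, total FROM orders")
    writer.writerow([col[0] for col in cursor.description])  # header row
    writer.writerows(cursor)                                  # data rows
```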

Does the application or server provide RBAC (Role-Based Access Control), and has it actually been implemented yet? If we minimise the number of people who could update or delete data (maliciously or accidentally), we minimise the risk of data loss.

What is the budget for the solution? All-singing, all-dancing physical or software solutions can be great, but you may not be able to afford them.

Are we looking for an appliance-based solution which includes storage, replication, backup plugins, etc, or do we already have the hardware and just need some software? This often, but not always, comes down to a time vs budget question. Do we want to spend our team’s time managing clunky backup software, or just buy an appliance which does half the work for us and is policy-based?

What are the data sovereignty requirements, and would a cloud-based service be appropriate for the business? It can be very cheap to store data in something like S3 or blob storage, if the business accepts this and you don’t need to pull any of the data back very often (if at all).
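
To give a feel for how cheap long-term cloud retention can be configured, here is a minimal boto3 sketch that uploads an archive object and adds a lifecycle rule transitioning it to Glacier after 30 days; the bucket name, key and timings are made-up examples, and equivalent policies exist for other providers’ blob storage.

```python
# Minimal sketch: push an archive file to S3 and add a lifecycle rule that
# moves anything under the "archive/" prefix to Glacier after 30 days and
# expires it after roughly 7 years. Bucket, key and timings are made-up examples.
import boto3

s3 = boto3.client("s3")

s3.upload_file("orders_archive_2017.csv", "example-archive-bucket",
               "archive/orders_archive_2017.csv")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-archive-bucket",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-to-glacier",
            "Filter": {"Prefix": "archive/"},
            "Status": "Enabled",
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 2555},   # roughly 7 years, per retention policy
        }]
    },
)
```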

How quickly is the data required when requested, how large is a typical access request, and how often are requests made? If the answer can be hours or days, then an offline or cloud solution may be appropriate, but anything where immediate access is required is a different story.

Similarly, will we want to restore or access this data in the event of a DR, and does this solution form part of our DR strategy? Perhaps it’s only required for access to much older data, because we are replicating the most recent data to a DR facility!

As we can see, there are many, many, [many!] things to think about when considering long-term retention of data in a backup or archive solution.

What brought this up Alex?…

… I hear you ask!

I recently attended Storage Field Day 13, where we had a presentation from a backup vendor, StorageCraft, who has been in the SMB and mid-market space for many years, and it got me thinking!

The latest iteration of their backup software provides a local cache with cloud integration, and the added ability to spin up a DR environment in the event of an outage to your primary DC. A pretty nifty feature if you are legally able to store your data outside of your local environment (they currently have DCs in the US and EU only).

They can also create backups using their proprietary SPF file format, which has apparently not changed since its inception around 15 years ago. There is also no concept of a media server, as each server manages its own backups (albeit with the ability to use a central scheduler tool). This gets around the issue of backup compatibility, though it may limit their ability to provide additional data services for the backup files, such as encryption, dedupe or compression, outside that of the storage targets they reside on.

This is what tickled my mental matrix into deploying my keyboard! 🙂

Want to Know More?

The session was recorded and is now available to stream online:

StorageCraft Presents at Storage Field Day 13

Some of the other SFD13 delegates had their own thoughts on the session and StorageCraft in general. You can find them here:

Dan Frith – StorageCraft Are In Your Data Centre And In The Cloud

Scott Lowe – Backup and Recovery in the Cloud: Simplification is Actually Really Hard

Disclaimer/Disclosure: My flights, accommodation, meals, etc, at Storage Field Day 13 were provided by Tech Field Day / Gestalt IT, but there was no expectation or request for me to write about any of the vendors’ products or services, and I was not compensated in any way for my time at the event.
