I recently created a post showing how to integrate the Infoblox plugin into vRA Cloud. This worked fine when I used the blueprinting component of Cloud Assembly. However, when the request was made directly via the IaaS API, the VM was created but it did not get an IP from my IPAM IP pool.
I had noted from the VMware docs that, to set a static IP address in a YAML-based blueprint, you use the property "assignment: static"; however, this property did not work as expected in the body of the API request.
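For reference, here is a minimal sketch of what an IaaS API machine-creation body might look like with the property carried on the NIC. The IDs are placeholders and the exact placement of the assignment property is an assumption on my part, so check it against your own environment:

```python
import json

# Hypothetical IaaS API machine-creation payload; IDs are placeholders.
payload = {
    "name": "demo-vm-01",
    "projectId": "<project-id>",
    "flavor": "small",
    "image": "centos7",
    "nics": [
        {
            "networkId": "<network-id>",
            # Mirrors the blueprint property 'assignment: static' so the
            # IPAM range is consulted for a static address (assumed placement).
            "customProperties": {"assignment": "static"},
        }
    ],
}

# This JSON string would be POSTed to the IaaS API machines endpoint.
body = json.dumps(payload, indent=2)
print(body)
```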
I was recently tasked with integrating Infoblox into vRA Cloud. There didn’t seem to be a lot of information on this, so I thought I would write a blog post 🙂 Hopefully others will find it useful.
This article assumes that you are familiar with vRA Cloud and that you already have an IP range set up in Infoblox. The process to integrate is actually very simple; however, there are some pre-requisites we need to meet before we can begin. Make sure that you have downloaded the ABX plugin from https://marketplace.vmware.com/vsx/solutions/cas-infoblox-plugin-for-abx-0-0-1 and also ensure you add the following extensibility attributes in Infoblox:
At VMworld 2018, VMware announced their new automation service, aptly named Cloud Automation Services (CAS), and earlier this year, in January, it reached General Availability.
So what is CAS? Well, it’s a multi-cloud solution driven by the infrastructure-as-code methodology and delivered by VMware as a SaaS model. CAS is made up of three components: Cloud Assembly, which allows for infrastructure and application delivery in line with DevOps principles; Service Broker, which provides a service catalog; and finally Code Stream, which focuses on pipelines and continuous delivery. Some of these names will be familiar, e.g. Code Stream, but you should note that these are not just upgrades of previous products; they have been written from scratch for a brand-new experience.
In this post, I wanted to focus on Cloud Assembly and give a brief introduction to the service. I have been using vRA for a number of years and one main problem was the pain of installation. In short, different products stitched together (think of the SQL and postgres DB fight) meant it was a real pain to deploy consistently as part of an overall private cloud product.
I have used vRO for quite some time, yet I have never really had a need to use Composite Types – until recently! This vRO feature is pretty cool and allows you to create arrays which can be consumed by a Workflow, and what is a real benefit for me is that they allow you to minimise the number of Workflows needed, or even the number of input parameters into a Workflow.
I am a real advocate of using configuration files in vRO and, as anyone who attended my Glasgow VMUG session will (hopefully) remember, these are used for global settings, meaning we don’t need to update individual Workflows if we point them at these central configuration files. So where do Composite Types fit in here? Well, recently I had a requirement around DNS information which had the potential to impact the manageability of Network Profiles in vRA. The requirement concerned how DNS information would be added to a VM during deployment, depending on things like location or operating system. Using vRA’s out-of-the-box IPAM meant that, in order to achieve this, I would have to create many profiles just to map the different DNS info, and then deal with the complexity of splitting the IP ranges within the Profiles. An alternative way to meet this requirement, while keeping it easy to manage and leaving room to grow as more sites were added, was Composite Types!
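Conceptually, the Composite Type I ended up with is just an array of named bundles mapping a site and operating system to its DNS servers, which a Workflow can then look up at deployment time. A quick Python sketch of that shape (the site names and addresses here are made up for illustration):

```python
# Each entry plays the role of one composite type value: a named bundle
# of fields, consumed by the Workflow as a single array input.
dns_map = [
    {"site": "Glasgow", "os": "Windows", "dns": ["10.0.1.10", "10.0.1.11"]},
    {"site": "Glasgow", "os": "Linux",   "dns": ["10.0.1.20", "10.0.1.21"]},
    {"site": "London",  "os": "Windows", "dns": ["10.1.1.10", "10.1.1.11"]},
]

def dns_for(site, os_name):
    """Return the DNS servers for a site/OS pair, or None if unmapped."""
    for entry in dns_map:
        if entry["site"] == site and entry["os"] == os_name:
            return entry["dns"]
    return None

print(dns_for("Glasgow", "Linux"))
```

Adding a new site is then just one more entry in the array, rather than a new Network Profile or a new Workflow input.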
Recently I was given a requirement to enhance a vRO workflow which added a VM to a Disaster Recovery policy in SRM. The existing workflow by default added all VMs to the Priority 3 (Normal) start-up order. My requirement was to allow the user to specify the start-up order.
Having a quick look at the environment, I could see that the SRM plugin was in use, so I felt this was a good start – however, it soon became apparent that it wasn’t ideal for me, given that the information we can get out of the plugin is limited, never mind the need to manipulate that data. Looking online, it seemed that using PowerShell was the common answer to automating this, but I also had a constraint of not introducing any new plugins. During my online hunt I found the SRM 6.5 API guide and this became a valuable resource. By browsing this guide it became apparent that the SOAP API was my only option, and I continued to refer to it in order to find a solution – https://www.vmware.com/support/developer/srm-api/site-recovery-manager-65-api.pdf.
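To give a flavour of what working with the SOAP API involves, here is a rough Python sketch of building the envelope for a login call. The operation, namespace, and element names here are illustrative of the style used in the SRM API guide – verify them against the actual WSDL before using anything like this:

```python
import textwrap

def soap_envelope(operation, body_xml):
    """Wrap an operation body in a minimal SOAP 1.1 envelope."""
    return textwrap.dedent(f"""\
        <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
          <soapenv:Body>
            <{operation} xmlns="urn:srm0">
              {body_xml}
            </{operation}>
          </soapenv:Body>
        </soapenv:Envelope>""")

# Illustrative login request; credentials are placeholders.
envelope = soap_envelope(
    "SrmLoginLocale",
    '<_this type="SrmServiceInstance">SrmServiceInstance</_this>'
    "<username>user@domain.local</username><password>secret</password>",
)
print(envelope)
# The envelope would then be POSTed to the SRM SOAP endpoint and the
# response XML parsed for the session cookie before further calls.
```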
I decided to write this blog because there seemed to be a severe lack of information on using SOAP for SRM.
Recently I sat the VCAP Design exam for Cloud Management and Automation, based on vRA 7.2. Previously I had sat the version 6 exam, which was based on the traditional split of Visio-based canvas scenarios and drag-and-drop questions. I learned that this version of the exam has significant changes, and in fact there are no more canvas-style questions. Indeed, most questions are multiple choice, with some drag and drop. The time allocation is also less than before: now only 130 minutes to answer 60 questions!
Going into study mode I felt confident having used vRA7.3 for some time now, however there are still slight differences between 7.2 and 7.3 that I had to brush up on. Additionally, due to the architecture of the product I work on, we don’t have a need to utilize all of what vRA can offer, so I also required a refresher on things like approval policies and the vRA portal.
So, where to start? I am lucky enough to have a lab at work where we develop, so I could use that for a “play around”. I created a new tenant and simply clicked everywhere and anywhere to get a feel for all vRA7 has to offer. I also completed some Hands-on Labs from VMware. They are an excellent resource and cater for all levels. From here you can also click around – no need to follow the guide :). I did focus, however, on the vRA/NSX integration labs. I much prefer these labs to reading, but I also brushed up on the design qualities that are always part of these types of exams. Having sat a few based on the DCV track, I always refer to Paul McSharry’s official guide and also the DCD 5.5 Study Pack from Jason Grierson, which is an excellent reference. I should also point out that the official exam guide here contains some really important references.
Recently I had the pleasure of presenting on vBrownbag with my colleague Konrad Clapa. Konrad is a double VCDX in DCV and CMA and I am very proud that we were able to speak about our vRO and vRA best practices. We have worked together for around 3 years now developing and architecting the Service Catalog for Atos DDC and DPC products. See our session below and get in touch with any questions ……
Recently I wanted to learn more about AWS, mainly for career progression but also because of the noise made this year with VMware and AWS joining forces and the shift towards Hybrid Cloud.
As usual, for motivation, I decided to set the exam date as a focal point to aim for. But uncharacteristically I pushed and pushed this exam back and lacked the motivation to study. It was also on my work development plan and had to be achieved, and I soon found myself at the start of December without having achieved it. Thankfully I was able to pass the exam and am now AWS-SAA certified.
In order to begin studying for this, I started where it seems everyone does and purchased the “acloudguru” course. I bought this off Udemy for around $10 in around March 2017 – which shows you the level of my procrastination (albeit I had a failed VCDX attempt to navigate this year as well). The course is a really good baseline for those who have never worked with AWS. The chapters are nice and short, around 20 minutes maximum, and this is intentional, to keep the attention of the listener. Be aware that you need to give a lot of time to get through this course. There are a lot of labs that you can follow, but I found I had to repeat things a few times. It’s also worth noting that although the official “acloudguru” site is now subscription-based, you can still get good deals from Udemy for the individual associate courses.
Recently, we decided to test vRO 7.3 in clustered mode. In previous versions we had not had a great experience with vRO clusters and as a result have always had single vROs in a Master and Slave setup. With the latest version seemingly offering more stability, we created a POC and load balanced as much as possible. I noticed a lack of blog posts about setting this up and decided to add one here.
At this stage I am assuming some pre-requisites have been met and design decisions have been made:
Load Balancers and DNS names have been setup for vRA and vRO
vRA has been configured and will be used as vRO authentication mode
SQL will be used as the Database for the cluster
vRO appliances have been deployed and powered on
Set Up Initial vRO Instance
Log on to the first of our two vRO appliances using the Control Center URL – https://hostnameofvro001.domain.local:8283/vco-controlcenter
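Before touching the Control Center, I find it handy to confirm that both nodes are actually answering on port 8283. A quick hypothetical Python check (the hostnames are placeholders for your own two appliances):

```python
import socket

# Placeholder hostnames for the two cluster nodes; 8283 is the
# Control Center port used in the URL above.
nodes = ["hostnameofvro001.domain.local", "hostnameofvro002.domain.local"]
CONTROL_CENTER_PORT = 8283

def control_center_url(host, port=CONTROL_CENTER_PORT):
    """Build the Control Center URL for a node."""
    return f"https://{host}:{port}/vco-controlcenter"

def is_reachable(host, port=CONTROL_CENTER_PORT, timeout=3):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for host in nodes:
    print(control_center_url(host), "up" if is_reachable(host) else "down")
```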
A few years ago, I was asked to deploy vCAC (as it was known then). Soon after I found myself part of a team dedicated to creating a new Service Catalog for an SDDC based Private Cloud offering. It was a huge learning curve for me and I was soon immersed in a world of Cloud development with decisions to make based on these new VMware Cloud tools. The one thing that I did learn very quickly was the importance of version control. Coming from an infrastructure background, this was alien to me but it soon became one of the most critical things I learnt about successfully developing a Private Cloud and more importantly – maintaining it!
At the heart of our Service Catalog was vRealize Orchestrator, and as requirements for automated Catalog items grew, so did the team. This caused a lot of issues: with many developers working simultaneously on the same product, changes were made to the same Workflows and relevant changes were lost. It soon became apparent we were lacking a sensible way to ensure our final packages were bug-free and not overwritten unintentionally. Natively in vRO we can export a package containing Workflows, Actions, Configuration Files etc., but this is not an ideal format in which to efficiently review or track changes. It was becoming impossible to keep tabs on what was happening.