Wednesday, 19 July 2017

Nakivo Backup & Replication v7.2 Beta Now Available

NAKIVO recently announced Nakivo Backup & Replication v7.2 Beta, which is now available to download with a number of cool new features and functionalities:

Asustor NAS Support


With ASUSTOR NAS support we can turn an ASUSTOR NAS into a VM backup appliance, which helps us offload VMware or Hyper-V backup infrastructure to a low-cost NAS and keep our backup software, hardware, storage, and data deduplication within the same device, as we have been doing with QNAP, Synology, and Western Digital NAS devices.

Transaction Log Truncation for Microsoft SQL Server


Another feature available in Nakivo Backup & Replication v7.2 Beta automatically truncates transaction log files after a successful virtual machine backup or replica is created, preventing log files from growing indefinitely.

Instant Object Recovery for Microsoft SQL


With this feature we can instantly recover SQL objects (databases and tables) directly from compressed and deduplicated VM backups; the objects can be recovered back to the original SQL Server or to a different SQL Server.

Flexible Job Scheduler


Enhancements have been made to job scheduling: we can now add multiple schedules to a single job, allowing it to run at different times. For example, a job can run at night on weekdays, when we have less workload, while on weekends the same job can run during the day, since the weekend workload is typically lighter.
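The multi-schedule idea can be pictured as a small scheduler rule. This is a hypothetical illustration, not Nakivo's implementation; the 23:00 weekday and 10:00 weekend start times are assumptions for the example.

```python
from datetime import datetime, time

# Hypothetical multi-schedule rule: one job, two schedules.
# Weekdays (Mon-Fri) run at night; weekends run during the day.
WEEKDAY_START = time(23, 0)   # assumed low-workload window on weekdays
WEEKEND_START = time(10, 0)   # assumed low-workload window on weekends

def start_time_for(day: datetime) -> time:
    """Pick the schedule that applies to the given calendar day."""
    is_weekend = day.weekday() >= 5  # 5 = Saturday, 6 = Sunday
    return WEEKEND_START if is_weekend else WEEKDAY_START

# A Friday runs at night, a Saturday during the day.
print(start_time_for(datetime(2017, 7, 21)))  # Friday  -> 23:00:00
print(start_time_for(datetime(2017, 7, 22)))  # Saturday -> 10:00:00
```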


Calendar Dashboard


With this new feature we can view all our jobs on a calendar, much like meetings, and quickly find free slots in a backup window, similar to using a scheduling assistant for meetings. The calendar provides a bird's-eye view of all our jobs and also lets us schedule new jobs directly from it. To make this feature even more useful, the calendar is integrated right into the job wizard, so a job can be scheduled right there as well.
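The "find a free slot in a backup window" idea boils down to an interval-gap search. The sketch below is only a conceptual illustration (the window boundaries and job times are made up), not Nakivo's code.

```python
def free_slots(window, jobs):
    """Return the gaps inside a backup window not occupied by jobs.

    window: (start_hour, end_hour); jobs: list of (start_hour, end_hour),
    assumed to lie within the window. Hours are floats for simplicity.
    """
    gaps, cursor = [], window[0]
    for start, end in sorted(jobs):
        if start > cursor:
            gaps.append((cursor, start))
        cursor = max(cursor, end)
    if cursor < window[1]:
        gaps.append((cursor, window[1]))
    return gaps

# Backup window 22:00-06:00 expressed as 22-30; two jobs already scheduled.
print(free_slots((22, 30), [(22, 23.5), (25, 27)]))  # [(23.5, 25), (27, 30)]
```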

Download your free trial here: Try Nakivo Backup & Replication v7.2 Beta

Monday, 10 July 2017

10 Things we need to know - Vembu Cloud

I have already dedicated a couple of articles to Vembu BDR, where we discussed the Top 5 Reasons to Choose Vembu BDR, a Product Review of Vembu BDR, and how to Replicate Virtual Machines Using Vembu BDR.

This article is dedicated to 10 important points we need to know about Vembu cloud services: Vembu CloudDR, Vembu Online Backup, Vembu SaaS Backup, and Vembu BDR360.

  • Vembu CloudDR provides extra protection for backed-up data by replicating it to the Vembu Cloud hosted in AWS; it also gives us the option to create a redundant copy of the backed-up data from multiple backup servers in Vembu cloud storage. In case of a backup server crash, hardware failure, or any other disaster, the replicated data, which is highly secure thanks to AES 256-bit encryption, can be restored from the Vembu Cloud anywhere and anytime.
  • Vembu CloudDR can be availed on a pay-as-you-go basis with 10 GB of free storage, and is available in various regions including America, Asia Pacific, and EMEA. It also gives us multiple restore options, including live VM boot in the Vembu Cloud, restoring via the internet anytime and anywhere, and downloading the restored data to a USB drive.
  • Vembu Online Backup provides end-to-end secure backup for Windows, Linux, and Mac machines and for applications like Exchange, SharePoint, and SQL Server, and is designed to meet compliance requirements based on local regulatory laws.
  • With Vembu Online Backup we can back up individual files and folders on Windows, Linux, and Mac machines, whether they are hosted in a virtual or physical environment. These files can be backed up without any restriction on the size or type of folders we have planned for backup.
  • When using Vembu Online Backup for Microsoft SQL backup, we can use the MS SQL plugin to back up critical databases and tables to the Vembu Cloud, and an entire SQL server can be recreated directly from the Vembu Cloud.
  • Vembu Online Backup gives administrators a variety of scheduling options to choose from, depending on how frequently the backups need to be performed. As far as retention of the backed-up data is concerned, we can choose either time-based or version-based retention.
  • Vembu SaaS Backup, another service that is part of the Vembu Cloud, provides a backup solution for Office 365 and Google Apps, offering secure cloud backup for emails, drives, calendars, and contacts in Office 365 environments.
  • Vembu SaaS Backup helps avoid potential data loss for our business caused by hacking, unauthorised access, or accidental deletion. It provides various features, including automated scheduling (backups run on a daily basis) and quick search (search emails, documents, and contacts while configuring the backup).
  • With Vembu SaaS Backup we can download our backed-up Google documents with a single click; there is no need to restore the user account to access a file, regardless of location and time.
  • Another solution available as part of the Vembu Cloud is Vembu BDR360, a centralized management and monitoring portal that gives us better visibility into the IT infrastructure currently backed up by Vembu backup and disaster recovery solutions. With Vembu BDR360 we get a 360-degree bird's-eye view of all the virtual machines, physical servers, applications, and endpoints protected by the Vembu BDR suite.

Wednesday, 5 July 2017

Back to Basics - Part 2 Amazon Elastic Compute Cloud aka EC2

We recently started a Back to Basics series for AWS-related blog posts, and this is the second post dedicated to the series. In our first post, Back to Basics - Part 1 An Introduction to Amazon Web Services, we had a quick overview of AWS and also talked about the various associate and professional level certifications available. This article is dedicated to Amazon EC2.

Amazon Elastic Compute Cloud (EC2)

Amazon EC2 is one of the most widely used web services. It provides resizable compute capacity and reduces the overall time required to obtain and boot a new Amazon EC2 instance to minutes. Security is addressed through security groups and network access control lists, which control the incoming and outgoing connections of our instances.

The initial software that runs on an EC2 instance when it launches is defined by an Amazon Machine Image (AMI), which includes the operating system and any application or system software. There are various AMI options to choose from: images published by AWS with different versions of operating systems (Windows and Linux); the AWS Marketplace, an online store where we can find software from different vendors to run on our EC2 instances; images generated from an existing instance; and, last but not least, images created from imported virtual machines (VMDK, VHD, or OVA) using the AWS VM Import/Export service.
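As a sketch of how an AMI lookup might look with the AWS SDK, the snippet below builds the parameter dictionary a boto3 `describe_images` call would take to list AWS-published images. The `name` and `platform` filters are real EC2 filters; the name pattern is just an example, and the call itself is kept in a comment so the snippet stays self-contained.

```python
def amazon_ami_filters(name_pattern, platform=None):
    """Build describe_images parameters for AWS-published AMIs."""
    params = {
        "Owners": ["amazon"],  # only images published by AWS itself
        "Filters": [{"Name": "name", "Values": [name_pattern]}],
    }
    if platform:  # EC2 only reports "windows" here; Linux AMIs omit it
        params["Filters"].append({"Name": "platform", "Values": [platform]})
    return params

params = amazon_ami_filters("Windows_Server-2016-English-Full-Base-*", "windows")
print(params["Owners"])        # ['amazon']
print(len(params["Filters"]))  # 2
# With boto3 this would be used as:
#   ec2 = boto3.client("ec2", region_name="us-east-1")
#   images = ec2.describe_images(**params)["Images"]
```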

When it comes to buying EC2 capacity, there are various purchasing options to choose from, depending on the load, traffic, and requirements.

On-Demand Instances let us pay for compute capacity on an hourly basis, and we can increase or decrease capacity depending on the demands of our application. This can be considered a good option when we are running our application on AWS for the very first time.

Reserved Instances give us the assurance that the instances will always be available for us in the availability zone, at a discounted price compared to On-Demand Instances. When choosing a Reserved Instance we have three payment options: All Upfront, where we pay for the entire reservation in advance; Partial Upfront, where we pay a smaller charge upfront and the remainder in monthly instalments; and No Upfront, where nothing is paid in advance and everything is billed in monthly instalments.
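To see how the three payment options compare, the sketch below computes an effective hourly price over a one-year term. All the prices are made-up illustration numbers, not real AWS rates.

```python
HOURS_PER_YEAR = 365 * 24  # 8760

def effective_hourly(upfront, monthly):
    """Effective hourly cost of a 1-year reservation: the upfront fee plus
    12 monthly instalments, spread over every hour of the term."""
    return (upfront + 12 * monthly) / HOURS_PER_YEAR

# Hypothetical prices for one instance type (illustration only).
all_upfront     = effective_hourly(upfront=350.0, monthly=0.0)
partial_upfront = effective_hourly(upfront=180.0, monthly=16.0)
no_upfront      = effective_hourly(upfront=0.0,   monthly=33.0)
on_demand       = 0.05  # assumed on-demand hourly rate

# Typically the more we pay upfront, the lower the effective hourly rate.
assert all_upfront < partial_upfront < no_upfront < on_demand
print(round(all_upfront, 4), round(no_upfront, 4))
```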


Spot Instances allow us to specify the maximum hourly price we are willing to pay to run a particular instance type. Amazon EC2 sets a Spot price for each instance type in each Availability Zone, which is the price all customers pay to run a Spot Instance for that given period. One use case for Spot Instances is workloads that are not time-critical: we specify the price we are willing to pay, and when our bid price is above the current Spot price we receive the requested instance.
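The Spot bidding behaviour described above boils down to a simple comparison: the request is fulfilled while the bid is at or above the current Spot price, and the instance can be interrupted once the Spot price rises above it. A minimal sketch (the prices are made up):

```python
def spot_request_fulfilled(bid_price, current_spot_price):
    """A Spot request runs only while the bid covers the market price."""
    return bid_price >= current_spot_price

# Track fulfilment as the Spot price moves (hypothetical price history).
bid = 0.035
for spot_price in (0.020, 0.030, 0.040):
    print(spot_price, spot_request_fulfilled(bid, spot_price))
# Fulfilled at 0.020 and 0.030; interrupted once the price hits 0.040.
```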

Dedicated Hosts, as the name suggests, are physical EC2 servers whose entire capacity is dedicated to us. They help reduce costs by letting us use our existing server-bound software licenses, including Windows Server, SQL Server, and SUSE Linux Enterprise Server (subject to the license terms), and can also help meet compliance requirements.

Tuesday, 4 July 2017

Back to Basics - Part 1 An Introduction to Amazon Web Services

I have been spending time these days learning about Amazon Web Services, and as part of this journey I am making notes for my own reference that I can use while preparing for the AWS certifications. I thought of making these notes available on my blog as a series of posts, named Back to Basics - Amazon Web Services Part 1, Part 2, and so on, as I have been doing for the VMware Back to Basics series.

I will start Part 1 of this Back to Basics series for Amazon Web Services with a quick overview of AWS and the various certifications available, depending on our areas of interest (Architect, Developer, or SysOps).

Overview Amazon Web Services

Amazon Web Services is an infrastructure service (compute + network) that can be leveraged together with storage and other services offered by Amazon. It helps us deploy services on demand without worrying about the underlying infrastructure: we no longer have to wait for hardware to be procured, racked and stacked, installed, and configured. With Amazon Web Services we can choose from a wide variety of services and spin them up in minutes.

Elastic computing is the key when working with AWS: it allows us to grow or shrink not just compute resources but storage resources as well, in minutes. We can also choose from the various storage options available and do tiering (fast, medium, and low performance) as we have been doing in on-premises environments.

When it comes to accessing the AWS environment, we can use the AWS Management Console (a web application for managing AWS cloud services), the AWS Command Line Interface (a CLI to manage and automate cloud services), or an AWS Software Development Kit (SDK) (application programming interfaces that interact with AWS web services).
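As a quick illustration of the CLI option, the snippet below assembles the argv list for an AWS CLI call and shows the boto3 (SDK) equivalent in a comment. `aws ec2 describe-instances` and `--region` are real CLI syntax; the region is just an example, and nothing here actually contacts AWS.

```python
def aws_cli_command(service, operation, region):
    """Assemble an AWS CLI invocation as an argv list."""
    return ["aws", service, operation, "--region", region]

cmd = aws_cli_command("ec2", "describe-instances", "us-east-1")
print(" ".join(cmd))  # aws ec2 describe-instances --region us-east-1
# The same operation through the SDK (boto3) would be:
#   ec2 = boto3.client("ec2", region_name="us-east-1")
#   reservations = ec2.describe_instances()["Reservations"]
# The argv list could be executed with subprocess.run(cmd) if the CLI is installed.
```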


There are a lot of services to talk about, and I will try my best to dedicate an article in the Back to Basics - AWS series to each service type and demonstrate how to create and manage those services. I have already set up my account on AWS and long ago dedicated an article to Amazon EC2 instances, which fall under the compute category; maybe I will rename that post Back to Basics Part 2 - AWS EC2 Instance so that all the AWS Back to Basics posts are properly aligned.

AWS Certification

It's time to look at the various certification tracks available with AWS. Believe me, if you are someone starting the AWS journey like me, I would strongly recommend having some understanding of virtualization, cloud computing, networking, and storage so that you enjoy the journey.


As we can see, there are various tracks available depending on our areas of interest. We can design the AWS infrastructure (AWS Certified Solutions Architect Associate and Professional), manage and troubleshoot the services (AWS Certified SysOps Administrator Associate and AWS Certified DevOps Engineer Professional), or follow the track for those from a development background (AWS Certified Developer Associate and AWS Certified DevOps Engineer Professional). Yes, we need both admin and development skills if we are planning to achieve the professional-level certification for the development and operations tracks.

Amazon Web Services also offers specialty certifications, which can be taken after completing an Associate certification in any of the roles; from there we may progress to a Specialty in Advanced Networking or Big Data, depending on our areas of interest.

Important Links -







Wednesday, 28 June 2017

Back to Basics - Part 16 vSphere Storage DRS

In our previous Back to Basics posts we discussed Virtual Machine Files (Part 1), Standard Switches (Part 2), vCenter Server (Part 3), Templates (Part 4), vApp (Part 5), Migration (Part 6), Cloning (Part 7), Host Profiles (Part 8), Virtual Volumes aka VVOLs (Part 9), Fault Tolerance (Part 10), Distributed Switches (Part 11), Distributed Resource Scheduler (Part 12), vCenter Server High Availability (Part 13), Creating Reports in vROps (Part 14), and Understanding VMware App Volumes (Part 15).
We also discussed the various tasks related to building a home lab in Part 1, Part 2, Part 3, Part 4, and Part 5.

I recently had to rebuild my home lab due to some hardware changes, and now that the lab is up and running I thought of dedicating an article to Storage DRS, another important feature available in VMware vSphere. But before we look at how to enable it and what features it provides, let's have a quick overview.

Storage DRS is enabled on a datastore cluster, where we add multiple datastores that work together to balance capacity and I/O latency. Before we enable SDRS, we need to ensure that all the datastores in the cluster are either NFS or VMFS (not a mix) and are mounted to hosts running ESXi 5.0 or later.

A Storage DRS cluster and a host DRS cluster (host HA and DRS) can coexist in the environment: load balancing of compute resources is taken care of by host DRS, while load balancing of storage capacity and I/O latency is taken care of by Storage DRS, and both can occur at the same time.


VMware vSphere Storage DRS works in much the same fashion as host DRS: it initially places virtual machines on the appropriate datastore based on storage capacity and I/O latency, and uses Storage vMotion to balance load across the datastores that are part of the datastore cluster.

While creating the SDRS cluster we can choose Manual Mode, where vCenter Server generates recommendations for virtual machine storage migrations but does not perform any actions, or Fully Automated mode, where virtual machine files are migrated automatically. We can also use affinity and anti-affinity rules to allow or prevent virtual machine disks being migrated to different datastores.

The initial disk placement option comes in handy when we are creating, cloning, or migrating a virtual machine: instead of selecting a single datastore we select the datastore cluster, and Storage DRS decides which datastore to use for the initial placement of the virtual machine disks based on storage usage.

All the other automation options, including the Space Balance, I/O Balance, Rule Enforcement, Policy Enforcement, and VM Evacuation automation levels, can be adjusted individually between Manual Mode and Fully Automated Mode.


The next option in the list, I/O metric inclusion, can be enabled or disabled. Beware that when it is disabled, vCenter Server does not consider I/O metrics when making Storage DRS recommendations, and the initial placement of disks happens based on space alone.

We can also specify the space threshold, based on either utilized space or minimum free space, and the I/O latency threshold for each datastore.
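The threshold logic can be pictured as a simple per-datastore check. The 80% space-used and 15 ms latency defaults below match the vSphere defaults as I recall them, but treat the whole snippet as an illustrative sketch rather than VMware's actual algorithm; the datastore names and stats are made up.

```python
def needs_rebalance(used_pct, latency_ms, space_threshold_pct=80, latency_threshold_ms=15):
    """Flag a datastore when utilized space or I/O latency crosses its threshold."""
    return used_pct > space_threshold_pct or latency_ms > latency_threshold_ms

# Hypothetical datastore stats: (name, % space used, observed latency in ms).
datastores = [("NFS-DS01", 85, 5), ("NFS-DS02", 60, 22), ("NFS-DS03", 40, 3)]
for name, used, latency in datastores:
    print(name, "rebalance" if needs_rebalance(used, latency) else "ok")
# NFS-DS01 trips the space threshold, NFS-DS02 the latency threshold.
```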


The next options in the list are pretty straightforward: we select the cluster or hosts and the datastores that we plan to make part of this datastore cluster. I recently created two NFS datastores for both my ESXi hosts, so I used them.

Tuesday, 27 June 2017

My Nakivo Backup & Replication EC2 Instance is Up and Running

We have already dedicated a couple of articles to Nakivo Backup and Replication v6.1, where we covered the architectural components and the new features available in that release; here is a link for quick reference: Demystifying Nakivo Backup and Replication v6.1.

We also discussed backup and recovery of Active Directory objects with Nakivo Backup and Replication v6.1; in case you missed it, here is the link for quick reference: Backup/Recover Active Directory Objects with Nakivo.

Apart from testing the backup and recovery functionality in Nakivo Backup and Replication v6.1, we also had a detailed discussion on replicating virtual machines; here is the link for reference: Replicate VMs with Nakivo Backup & Replication.

We also talked about working with the Nakivo Backup and Replication appliance and what's new in Nakivo Backup and Replication v6.2, announced by NAKIVO on October 13th, 2016, which provides backup, replication, and recovery of paid EC2 instances sold through the AWS Marketplace. We also discussed What's New - Nakivo Backup and Replication v7 Part 1, where we saw highlights of the Hyper-V support added for Hyper-V 2016 and 2012 in Nakivo v7.

In our most recent posts in the Nakivo Backup and Replication series, we saw how Nakivo Backup and Replication helps us protect our backups by creating a Backup Copy, and we discussed the upcoming features in Nakivo Backup and Replication v7.2.

This article is dedicated to setting up a NAKIVO Backup & Replication AWS EC2 instance, which can back up AWS EC2 infrastructure in the same region, across regions, or to an on-premises site. We can simply log in to the AWS Marketplace and search for NAKIVO Backup and Replication; the current version available is v7.1.0. EC2 charges for micro instances are free for up to 750 hours a month if we qualify for the AWS free tier. https://aws.amazon.com/marketplace/pp/B01BKFOEP0


Before clicking Continue and proceeding with the configuration of your Nakivo Backup and Replication AWS EC2 instance, check the region and the pricing, as pricing can vary based on the selected region.


Once we click Continue we can proceed with the 1-Click Launch option, which gives us a quick overview where we select the region and choose the EC2 instance type. In this blog I will proceed with a t2.micro instance, which runs with 1 GiB of memory and 1 virtual core. We can also review the VPC settings and security group, where we can create a new VPC if one does not already exist and create a subnet within the VPC.

While specifying the security group we can create a new one based on the seller's settings, or choose an existing one from the drop-down menu. This also gives us a few more options for the connection method, where we can specify the source IP (Anywhere, My IP, or a custom IP).
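The three source-IP choices map to CIDR ranges in the security group's ingress rule. The sketch below builds the parameter dictionary that a boto3 `authorize_security_group_ingress` call would take; the "My IP" address and the security group ID in the comment are placeholders for the example.

```python
def ingress_rule(port, source, my_ip=None):
    """Translate the console's source choice into a CIDR-based ingress rule."""
    cidr = {
        "anywhere": "0.0.0.0/0",                    # open to the whole internet
        "my-ip": f"{my_ip}/32" if my_ip else None,  # just this one workstation
    }.get(source, source)                           # otherwise treat source as a custom CIDR
    return {
        "IpProtocol": "tcp",
        "FromPort": port,
        "ToPort": port,
        "IpRanges": [{"CidrIp": cidr}],
    }

rule = ingress_rule(443, "my-ip", my_ip="203.0.113.10")
print(rule["IpRanges"])  # [{'CidrIp': '203.0.113.10/32'}]
# With boto3 (placeholder group id):
#   ec2.authorize_security_group_ingress(GroupId="sg-0123456789abcdef0",
#                                        IpPermissions=[rule])
```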


The second option in the list is Manual Launch, which provides the version details and the Amazon Machine Image IDs required to launch an instance in the cloud.
The last option is Service Catalog, which allows us to copy the Nakivo Backup and Replication software to a Service Catalog in a specific region where that service is supported, and then assign users and roles.

In this demo I selected the 1-Click Launch option to create my Nakivo Backup and Replication EC2 instance and created a new key pair.


Once the instance is up and running, we can log in using the default username admin; the password is the AWS EC2 instance ID.

Wednesday, 21 June 2017

Ensuring Business Continuity With Vembu BDR

I recently got an invite to attend a webinar on How to Address Data Center Challenges hosted by Vembu Technologies, a privately held information technology company that specializes in developing software products in the backup, disaster recovery, and cloud storage domains.

The webinar will be held on Tuesday, July 4 at 12:00 PM GST. I was able to get a brief insight into the topics being covered, so I thought of dedicating a quick post to those points, which will give you an idea of why you should attend this webinar.



1) How fast can we recover the data center during a disaster? This is one of the most important questions to address when designing an environment, and recoverability should be considered part of the design phase. The webinar gives us insight into how fast we can recover our data center during a disaster.

2) Another important point that will be addressed during the webinar is how to avoid data loss during disaster recovery.

3) Planning P2V and V2V migrations.

Save your spot - How to Address Data Center Challenges

Tuesday, 20 June 2017

Upcoming Features - Nakivo Backup and Replication v7.2

We have already dedicated a couple of articles to Nakivo Backup and Replication v6.1, where we covered the architectural components and the new features available in that release; here is a link for quick reference: Demystifying Nakivo Backup and Replication v6.1.

We also discussed backup and recovery of Active Directory objects with Nakivo Backup and Replication v6.1; in case you missed it, here is the link for quick reference: Backup/Recover Active Directory Objects with Nakivo.

Apart from testing the backup and recovery functionality in Nakivo Backup and Replication v6.1, we also had a detailed discussion on replicating virtual machines; here is the link for reference: Replicate VMs with Nakivo Backup & Replication.

We also talked about working with the Nakivo Backup and Replication appliance and what's new in Nakivo Backup and Replication v6.2, announced by NAKIVO on October 13th, 2016, which provides backup, replication, and recovery of paid EC2 instances sold through the AWS Marketplace. We also discussed What's New - Nakivo Backup and Replication v7 Part 1, where we saw highlights of the Hyper-V support added for Hyper-V 2016 and 2012 in Nakivo v7.

In our most recent post in the Nakivo Backup and Replication series, dedicated a few weeks ago, we saw how Nakivo Backup and Replication helps us protect our backups by creating a Backup Copy.

This article is dedicated to some upcoming features in Nakivo Backup and Replication v7.2, which was announced by Nakivo on June 19, 2017.

VM Backup Appliance Based on Asustor NAS 


The Nakivo Backup and Replication appliance will soon be available on ASUSTOR NAS, which will help turn an ASUSTOR NAS into a VM backup appliance. The solution will help us offload VMware or Hyper-V backup infrastructure to a low-cost NAS and keep our backup software, hardware, storage, and data deduplication within the same device, as we have been doing with QNAP, Synology, and Western Digital NAS devices.

Microsoft SQL Log Truncation

Microsoft SQL Log Truncation is another announced feature coming in Nakivo Backup and Replication v7.2: it automatically truncates transaction log files after a successful virtual machine backup or replica is created, preventing log files from growing indefinitely.


Calendar Dashboard

Another cool feature, which helps us schedule backup jobs from a calendar and provides a quick overview of all jobs and their durations.


Instant Object Recovery for Microsoft SQL

Nakivo Backup and Replication helps us instantly recover SQL objects (databases and tables) directly from compressed and deduplicated VM backups; the objects can be recovered back to the original SQL Server or to a different SQL Server.

Monday, 19 June 2017

My VCAP6-DCV Deploy Journey

Last week I passed the VMware Certified Advanced Professional 6 - Data Center Virtualization Deployment exam. The journey was awesome and indeed required a lot of dedication and hard work.

This blog post is dedicated to helping fellow community members who are preparing for this exam; it covers my exam experience and provides the list of books, blogs, and other sources I referred to while preparing.

My Exam Experience

I booked my exam for a Thursday afternoon and reached the exam center 1.5 hours before my slot, ensuring I had enough time to relax, and so that if there were any technical issues at the exam center I would know about them well in advance. This is something I highly recommend to all fellow community members planning or preparing for this exam: make sure you reach early on your exam day.



Again, it's not mandatory to reach exactly 1.5 hours before :-) Reaching 45 minutes before your exam slot gives you enough time to prepare yourself and will help you save a lot of time when writing your exam; for sure, no one wants to waste precious time using the washroom during the exam :-) It's a long exam, and preparing yourself well in advance will help you save a lot of time.

Stay Calm

When starting your exam, stay extra calm, as you may have to spend some time understanding and adjusting the desktop resolution to your preference, and losing patience is not going to help. I spent almost 15 minutes getting used to the overall look and feel of the exam and adjusting the screen to my preference. I would highly recommend everyone visit Hands-on Labs and test out a few labs, which will give you a good idea of how to deal with the desktop during the actual exam.

In the VCAP6-DCV Deploy exam every second counts, and spending time reading each question thoroughly before starting the task pays off; at least it did in my case, because I was able to save a lot of time while doing the tasks.

Be aware that CTRL+ALT and BACKSPACE do not work; go back using the arrow keys and then press Delete. It's better to use the on-screen keyboard if you wish to copy and paste.

The Preparation

Being a VMware Certified Instructor, I deliver a lot of VMware vSphere courses and consider them the foundation for writing VMware exams: these courses are properly aligned with the exam blueprint, and they not only give you the knowledge you need but also provide various scenarios to practice in a lab-based environment.

Apart from the VMware vSphere Design and Deploy Fast Track v6 course, which is the recommended training before writing this exam, there are a few more courses I personally recommend: VMware vSphere 6 Optimize and Scale and the VMware vSphere 6 Troubleshooting Workshop. I learned a lot of great technical material while attending and preparing to deliver these courses.

If you are preparing for the VCAP6-DCV Deploy exam, make sure you align yourself with the exam objectives and simulate every objective in your home lab environment. If you are not sure how to simulate the objectives in your home lab, refer to the VCAP6-DCV Deployment Guide blog post. If you are one of those who has not yet deployed a home lab, believe me, you have a long way to go; I have dedicated a couple of articles to home lab deployment, and here is the link for reference: My New VMware Home Lab is Spinning.

Important Links

As I said above, it was a long journey, and a lot of dedication and hard work is required to achieve this credential. Below are some of the technical guides I referred to during my preparation, which I strongly recommend going through before you write the exam.




Wrap up

Passing the VCAP6-DCV Deploy exam takes a lot of effort, achieved by properly aligning yourself with the exam objectives and simulating them in your home lab. Once you have practiced these objectives multiple times in your home lab, you are good to go. But as I said, going through the objectives is one thing; the other important aspect is staying cool and calm while writing the exam: read each question twice and understand the requirements before you begin the task.