
Category Archives: Microsoft

Getting started with Kubernetes

I recently started learning Kubernetes (aka k8s) as part of my work. In this post I will share some high-level notes I made while reading through different content, along with links to resources for getting started quickly.

Pre-requisites:

  • A high-level understanding of containers (Docker, etc.) is good to have

Learning links:

  1. Kubernetes tutorial – this is the first link I went through to get a very high-level overview of what Kubernetes is and how to use the different commands on a k8s cluster.
  2. Scalable Microservices with Kubernetes – a very good course on Udacity that gives a solid overview. The tutorial above also references this course.
  3. Using Visual Studio Team Services to deploy applications to Azure Container Service – this blog talks about using the VSTS Release Management solution to automate deployments to k8s.

Learning notes:

  • In simple terms, you can think of k8s as a platform for container cluster management.
  • A k8s cluster has a master node, which manages the cluster, and a set of other nodes, which are workers that run the applications. For production traffic, a k8s cluster should have at least 3 nodes.
  • Key terms in k8s – Deployment, Pods, Volumes, Services, kubelet, ConfigMaps/Secrets, etc.
  • minikube is a lightweight k8s implementation that creates a VM on the local machine and deploys a simple cluster with one node.
    • While following the Hello Minikube section on Windows, you can run “minikube docker-env” instead of “eval $(minikube docker-env)“. Once the “minikube docker-env” command is executed it prints a message with the subsequent command (a for loop) to be run to set certain variables; make sure to run that step.
  • kubectl commands are used for all cluster management; a few common commands are shown below.
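
For illustration, here is a minimal sketch of a few common kubectl commands (the deployment name and image are placeholders, not from a real project; on the k8s versions current at the time of writing, “kubectl run” creates a Deployment):

kubectl get nodes                                              # list the nodes in the cluster
kubectl run hello-web --image=nginx --port=80                  # create a deployment running one container
kubectl get deployments                                        # list deployments
kubectl get pods                                               # list the pods created by the deployment
kubectl expose deployment hello-web --type=NodePort --port=80  # expose the deployment as a service
kubectl get services                                           # list services and their ports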

Let’s understand the key terms:

  • Deployment:
    • is a declarative way to say what goes where in the cluster
    • is used to enforce the desired state as provided by the user
    • owns and manages ReplicaSets, which it creates to handle pod creation/deletion/updates and to maintain the desired number of Pods
    • if a node goes down, the deployment takes care of creating new pods and placing them on the available nodes
  • Pods:
    • represent logical applications
    • each pod has one or more containers. When you have apps that have a hard dependency on one another, you package them into the same pod
    • containers in a pod share volumes, and volumes live as long as the pod lives
    • containers in a pod share a namespace, which helps them communicate with each other
    • each pod has one IP address
    • containers in a pod can communicate with each other using inter-process communication (IPC). Containers in different pods have different IPs and hence can’t communicate using IPC
    • Why Pods?
  • Volumes: 
    • you can think of them as shared storage for the containers in a pod
  • Kubelet:
    • each node has a kubelet
    • it manages the pods and containers running on that node
    • it is an agent that helps the node talk to the master
  • Services:
    • define a logical set of pods and a policy by which to access them
    • provide persistent endpoints for pods
    • enable loose coupling between pods
    • are defined using YAML/JSON
    • the set of pods targeted by a service is determined by a label selector
    • pod IPs are not exposed outside the cluster without a service
    • services have an integrated load balancer to distribute network traffic to all pods
    • you can expose your pods/containers using the “kubectl expose” command
    • while creating the service following the “Hello Minikube” section, set the type as NodePort in case the LoadBalancer type doesn’t work; a sample YAML definition is shown after this list
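
To make the Deployment and Service terms concrete, here is a minimal sketch of a YAML definition (the names, image, ports and file name are placeholders rather than from a real project, and the API version may differ on your cluster); the service uses the NodePort type mentioned above:

apiVersion: apps/v1beta1          # use the Deployment API version supported by your cluster
kind: Deployment
metadata:
  name: hello-web
spec:
  replicas: 3                     # desired state: three pod replicas
  template:
    metadata:
      labels:
        app: hello-web            # label used by the service selector below
    spec:
      containers:
      - name: hello-web
        image: nginx
        ports:
        - containerPort: 80
---
apiVersion: v1
kind: Service
metadata:
  name: hello-web
spec:
  type: NodePort                  # exposes the service on a port of each node
  selector:
    app: hello-web                # targets the pods carrying this label
  ports:
  - port: 80
    targetPort: 80

Applying this with “kubectl apply -f hello-web.yaml” creates the deployment and the service; “kubectl get services” then shows the node port assigned to it.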

Kubernetes is very actively worked on and many things are changing rapidly. There is an interesting talk from Deis (a PaaS on Kubernetes) in which it is mentioned that in the CNCF “Kubernetes has been offered as a seed technology” – this also signifies that k8s is a good technology to invest your time in learning.

PS: Azure has very strong support for containers, container registries, and different orchestration platforms like Docker Swarm, DC/OS, and k8s. Do try things out on Azure and keep learning.


Posted on July 27, 2017 in Containers

 


Continuous Delivery to Azure App Services using VSTS Release Management

Our team has been working on a very important feature addition to Azure App Service for the last couple of months, which went into Preview at the recent Connect() 2016 event. I built most of the UX and back end for enabling this experience.

This feature is about making it easy for a customer to configure Continuous Delivery/Deployment from the new Azure Portal for a given App Service. The source code can be either in VSTS (currently only Git repos are supported) or GitHub, and the applications can be either ASP.NET or ASP.NET Core.

A detailed blog post can be found here – Continuous Delivery to Azure App Service

A video of the demo shown at the Connect() event by Brian and Jamie can be found below (you can skip to the 55-minute mark to start seeing our experience :))

Do try out the new feature and share your comments.

https://channel9.msdn.com/Events/Connect/2016/ALM-DevOps-with-Brian-And-Jamie/player

 

Posted on December 28, 2016 in DevOps, Microsoft

 


Load Testing Azure WebApps from Microsoft Azure Portal

I am excited to share a great feature added to the Azure Web Applications experience in the Azure Portal: the capability to load test a web application right from the portal. There is a detailed blog post on the Visual Studio MSDN Blogs about this announcement – Announcing Public Preview for Performance/Load Testing of Azure WebApp. There is a similar announcement on the Azure blog – Public Preview of Performance Test on Web and Mobile App.

Here are two videos on the topic on Channel9 (Do watch them):

  1. Azure Friday Video on Performance Testing Web Apps
  2. A follow-up video on doing advanced scenarios from Visual Studio Enterprise

I know what you are thinking: the experience and the user interface look brilliant, right? Yes, they do. There has been a great effort behind building this experience :). There were good challenges and a great deal of learning as well.

And I do think you would be interested in understanding how the new Azure Portal is being built. There is a very good blog from Justin Beckwith which explains this perfectly – Under the hood of the new Azure Portal. I recommend it as a must-read to learn about the new portal. Make sure you watch Steve Sanderson’s video as well, it’s really cool.

Please do try out the new feature and share your valuable feedback.

Questions and Answers

Question: How do I make an existing Visual Studio Online account work with the new Load/Performance Test experience in the Azure Portal?

Answer: Your Visual Studio Online account should be AAD-backed. To check whether your account is AAD-backed, go to the account settings; if it is not, you can follow the link provided there.

There is a blog post which should make things easy: Link Visual Studio Account to Azure

Question: [A very, very rare scenario] I am facing an issue setting up an account as shown below. How do I unblock myself?

[Image: SadCloud]

Answer: You shouldn’t be seeing this at all, but there is a very small chance you might face it. In such cases, click on the gear button at the top right corner of the Azure Portal, which takes you to the “Portal Settings”. Here you can click on the “Discard modifications” button and things should start working.

[Image: Gear]

[Image: Settings]

The impact of this change is that it clears user settings in the Azure Portal: pinned tiles on the home page will be removed, customized user settings will be reverted, and the portal starts afresh. Please note that it does not clear any application data.

 

Posted on September 16, 2015 in Cloud Load Testing, Load Testing, Microsoft

 


Roslyn – The .NET Compiler platform

Roslyn is the .NET Compiler Platform, which provides open-source C# and Visual Basic compilers with rich code analysis APIs. Roslyn exposes modules for syntactic analysis of code, semantic analysis, dynamic compilation to CIL, and code emission.

I heard about Roslyn during one of the Hackathon events but didn’t get a chance to explore it till a couple of days back. As part of the recent Hackathon, I got a chance to work on a project where Roslyn helped make my work quick and easy. I was amazed at the ease with which we can use the compiler services. This tutorial helped me get on track quickly, thanks to the author – Learn Roslyn Now. The same author has a FAQ section which gave me a path to many of my requirements for the project, such as: given a particular piece of code in a workspace, how do I find all the references across the solution? A rough sketch of that scenario is shown below.
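
Here is a minimal sketch (not the exact code from the Hackathon project) of how that find-all-references scenario can look with Roslyn’s MSBuildWorkspace and SymbolFinder APIs, assuming the Microsoft.CodeAnalysis.CSharp.Workspaces and Microsoft.CodeAnalysis.Workspaces.MSBuild NuGet packages are installed; the solution path and the method name “DoWork” are placeholders:

using System;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.FindSymbols;
using Microsoft.CodeAnalysis.MSBuild;

class FindReferencesSample
{
    static async Task Main()
    {
        // Open an existing solution through an MSBuild-backed workspace
        var workspace = MSBuildWorkspace.Create();
        Solution solution = await workspace.OpenSolutionAsync(@"C:\src\MySolution.sln");

        // Locate a symbol to query - here, the first member named "DoWork" in any project
        ISymbol symbol = null;
        foreach (var project in solution.Projects)
        {
            var compilation = await project.GetCompilationAsync();
            symbol = compilation.GetSymbolsWithName(n => n == "DoWork", SymbolFilter.Member).FirstOrDefault();
            if (symbol != null) break;
        }
        if (symbol == null) return;

        // Ask Roslyn for every reference to that symbol across the whole solution
        var references = await SymbolFinder.FindReferencesAsync(symbol, solution);
        foreach (var reference in references)
            foreach (var location in reference.Locations)
                Console.WriteLine(location.Document.Name + ": " + location.Location.GetLineSpan());
    }
}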

This Hackathon was great learning and great fun. The most memorable part was demoing this project to one of our CVPs :). I am very thankful to one of my teammates for the idea!

Do read about Roslyn and get amazed 🙂

 

Posted on July 31, 2015 in Coding, Microsoft

 

Getting started with Git (with Visual Studio)

What is Git?

Git is a free and open source distributed version control system designed to handle everything from small to very large projects with speed and efficiency. (Ref: git-scm.com)

I have been using Git in Visual Studio (my primary IDE for development) for the last couple of weeks and found it really amazing. I went through a few tutorials on the web and created the learning list below, which should get you started quickly and give a very good understanding of how Git works. I felt Git is so good that once you start using it you probably wouldn’t want to go back to centralized version control again 🙂

1. Use Visual Studio With Git – MSDN article on getting started with Git on VS

2. http://git-scm.com/ – Official Site

3. A good tutorial on Microsoft Virtual Academy – Using Git With VS – I really liked this tutorial. It probably takes about 8 hours of your time.

4. Linus Torvalds’ talk at Google

5. Git for Ugly and Stupid people

6. Git branch naming best practices

7. Git Tutorial on Atlassian

You can search for more in-depth articles on Git. But the above links should make you comfortable with Git.

Please do share other useful links on Git in the comments.

 

Posted on September 5, 2014 in Microsoft

 


Cloud load testing with Visual Studio Online

This post has been long delayed. I have been working on Cloud Load Testing (CLT) with Visual Studio Online (VSO) for close to a year and a half, and it has been a great learning curve. I got the opportunity to work on various areas like scalability, building frameworks to understand service usage, integrating with other VSO services, building the service dashboard, and getting my hands dirty building UI. A fun ride. A point of real pride is that this product was built completely from the ground up at Microsoft IDC, Hyderabad.

Cloud Load Testing Service in short:

The service greatly reduces the amount of work required to set up the infrastructure for load testing an application, configure it with the right settings, and maintain the environment. With CLT in place, the service takes care of setting up the agents, deploying your tests, and running your tests automatically, so you can just focus on what matters the most – finding and fixing performance and scale related problems in your application.

Here is a series of good links to explore CLT:

1. Super Simple Load Test Trial Experience announced by Brian Harry – with this announcement customers can run a load test from Visual Studio Online (visualstudio.com). This experience is currently visible only to users with an MSDN Ultimate license. In case you don’t have an Ultimate license, you can still get a view of the experience from this Channel 9 video demoed by Chuck – Load Testing Made Easier

2. Brian Harry announcing CLT – we all watched this demo with a lot of excitement 🙂

3. Load Test your App – MSDN Article

4. CLT General Availability note 🙂

– This link has various links to getting-started docs

– With the GA release, customers can monitor metrics from their applications under test using Application Insights and quickly troubleshoot any performance issues.

5. Link to a good video

You get 15,000 virtual user minutes if you have an account on VSO. Do try it out and share your feedback.

Note: All the text in this blog is written by me and my employer holds no responsibility 😛

 
 


Migrating from the Azure Caching service to Dedicated Cache

1. Introduction:

This post is mainly targeted at helping users migrate their existing cloud projects that run with the Azure Caching service to dedicated cache. For an introduction to what’s new with dedicated cache, please visit the Distributed Cache section in ScottGu’s blog post – here

Two things to do before starting

1. The first thing to do before starting the migration is to decide which of the two deployment modes you would like to use – either enable cache in one of your existing roles so that the cache co-exists with your app (let’s call this Co-located), or host the cache in a separate role (let’s call this Dedicated Role for the purpose of our discussion).

2. Install/upgrade to the latest 1.7 SDK.

2. Enabling and using Cache in different deployment modes

OK. Now that you have decided which deployment model to use, this section talks about enabling and using the cache in the different modes.

2.1 Setting up the cache service:

Open Visual Studio and your existing cache cloud project. Under the “Roles” folder, right-click on a role and select its properties.

Here you can find a new “Caching” tab (this shows up only if the 1.7 SDK is installed).

In the co-located model, enable cache for the role of interest by selecting “Enable Caching” in the “Caching” tab. Once you have enabled caching, you can choose what percentage of the role’s memory can be used by the cache by choosing the “Co-located Role” option and adjusting the size as per your requirements.

In order to have a dedicated role, right-click on the “Roles” section -> Add -> “New Worker Role”. Choose a “Cache Worker Role” to add to the project. In the newly added Cache Worker Role’s properties, go to the Caching tab and you can observe that the “Dedicated Role” option is selected.

Note: Before you deploy your application to Azure, specify storage account credentials. Clicking on the button to specify the storage account brings up a dialog box where you can specify the storage account to which cache runtime information is logged.

2.2 Creating caches:

When you enable caching, a “default” cache is created for you. New named caches can be created quickly in the Visual Studio IDE. To create named caches, go to the “Caching” tab (as mentioned in the previous section); under “Named Cache Settings”, click on “Add Named Cache” to add a cache, and click on “Remove Named Cache” to remove a selected cache from the list. The cache settings are described in section 2.5 below.

2.3 Consuming the cache from client application

Now let’s look at what changes are required in your existing application’s client configuration to use the new cache models. A pre-requisite for the next steps is to have NuGet installed.

1. First, remove the following DLLs from your client role’s references:

– Microsoft.ApplicationServer.Caching.Client
– Microsoft.ApplicationServer.Caching.Core
– Microsoft.Web.DistributedCache
– Microsoft.WindowsFabric.Common
– Microsoft.WindowsFabric.Data.Common

2. Right-click on the cache client role and select “Manage NuGet Packages”. Search for the “Windows Azure Caching Preview” package and install it. This adds all the necessary DLLs to your references.

3. If the client configuration is in app.config, the changes below are required.

This is what is currently present in your app.config.

<configSections>
  <section name="dataCacheClient" type="Microsoft.ApplicationServer.Caching.DataCacheClientSection, Microsoft.ApplicationServer.Caching.Core" allowLocation="true" allowDefinition="Everywhere" />
</configSections>
<dataCacheClient>
  <hosts>
    <host name="cacheName.cache.windows.net" cachePort="22233" />
  </hosts>
  <securityProperties mode="Message">
    <messageSecurity authorizationInfo="ACSToken">
    </messageSecurity>
  </securityProperties>
</dataCacheClient>

This is what you need to update it to:

<configSections>
  <section name="dataCacheClients" type="Microsoft.ApplicationServer.Caching.DataCacheClientsSection, Microsoft.ApplicationServer.Caching.Core" allowLocation="true" allowDefinition="Everywhere" />
</configSections>
<dataCacheClients>
  <tracing sinkType="DiagnosticSink" traceLevel="Error" />
  <dataCacheClient name="default">
    <autoDiscover isEnabled="true" identifier="[cache cluster role name]" />
  </dataCacheClient>
</dataCacheClients>

Notice the difference between the two configurations: the section name declared in configSections and the data cache client(s) tag name must be the same.

You no longer need <hosts> and <securityProperties>. Instead of <hosts>, you specify an identifier, which is the name of the role in which the cache is enabled.

You don’t need any security settings here because your application and the cache are in the same deployment and hence already secured.

4. If the client configuration is programmatic, make the changes below.

Your existing code:

string hostName = "[Cache endpoint without port]";
int cachePort = sslEnabled ? 22243 : 22233; // Default ports
List<DataCacheServerEndpoint> server = new List<DataCacheServerEndpoint>();
server.Add(new DataCacheServerEndpoint(hostName, cachePort));
DataCacheFactoryConfiguration config = new DataCacheFactoryConfiguration();
string authenticationToken = "[InsertAuthenticationTokenHere]";
SecureString secureAuthenticationToken = GetSecureString(authenticationToken);
config.SecurityProperties = new DataCacheSecurity(secureAuthenticationToken, sslEnabled);
config.Servers = server;
DataCacheFactory myCacheFactory = new DataCacheFactory(config);
myDefaultCache = myCacheFactory.GetDefaultCache();

Below is what you need to update to:

To continue with Default cache:

Co-located mode:

DataCacheFactoryConfiguration cfg = new DataCacheFactoryConfiguration();
cfg.AutoDiscoverProperty = new DataCacheAutoDiscoverProperty(true);
DataCacheFactory dcf = new DataCacheFactory(cfg);
DataCache dc = dcf.GetDefaultCache();

Dedicated mode:

DataCacheFactoryConfiguration cfg = new DataCacheFactoryConfiguration();
cfg.AutoDiscoverProperty = new DataCacheAutoDiscoverProperty(true, "[cache role name]");
DataCacheFactory dcf = new DataCacheFactory(cfg);
DataCache dc = dcf.GetDefaultCache();

In order to use named caches, everything remains the same except the DataCache object initialization, which would be:

DataCache dc = dcf.GetCache("[namedCache]");

As mentioned earlier, you can observe that DataCacheServerEndpoint and DataCacheSecurity are not required with the new cache; instead of these we use the AutoDiscoverProperty. Once the DataCache object is created, reading and writing items works the same way as before; a small usage sketch is shown below.
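
A minimal usage sketch with placeholder keys and values (the DataCache read/write API itself is unchanged by the migration):

// Add or update an item in the cache
dc.Put("greeting", "Hello from Windows Azure Caching");

// Read it back; Get returns null if the key is not present
string greeting = (string)dc.Get("greeting");

// Store an item with an explicit time-to-live
dc.Put("session-state", "some serializable state", TimeSpan.FromMinutes(10));

// Remove an item
dc.Remove("greeting");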

2.4 Enabling Diagnostics for the cache service:

Cache diagnostics, enabling crash dumps, and performance counters for the cache service will be discussed in a different article that talks about troubleshooting tips and tricks.

2.5 Cache Settings description:

Name – indicates the cache name

Backup copies – enables High Availability, which creates copies of the data on the specified number of nodes

Notifications – enables notifications for this cache

Eviction Policy – specifies the policy used to evict objects from the cache

Time To Live – the default time for an object to live in the cache

Expiration Type – indicates the expiration type for an object in the cache

3. Advanced cache configuration Options:

Enabling local cache:

– with Notifications:

  • The local cache items get invalidated using a notification-based mechanism. In order to have notification support, the cache must have notifications enabled while creating the named cache, as mentioned in the earlier section.

<dataCacheClient name="default">
  <localCache isEnabled="true" sync="NotificationBased" objectCount="100000" ttlValue="300" />
</dataCacheClient>

– with TTL:

  • The local cache items get invalidated using a TTL (timeout)-based mechanism.

<dataCacheClient name="default">
  <localCache isEnabled="true" sync="TimeoutBased" objectCount="100000" ttlValue="300" />
</dataCacheClient>
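
If your client configuration is programmatic rather than in app.config, the local cache can be configured through DataCacheLocalCacheProperties on the factory configuration. Below is a minimal sketch, assuming the constructor that takes an object count, a default timeout, and an invalidation policy, matching the TTL-based settings shown above; the role name is a placeholder:

DataCacheFactoryConfiguration cfg = new DataCacheFactoryConfiguration();
cfg.AutoDiscoverProperty = new DataCacheAutoDiscoverProperty(true, "[cache role name]");

// Local cache holding up to 100000 objects, invalidated 300 seconds after they are cached
cfg.LocalCacheProperties = new DataCacheLocalCacheProperties(
    100000,
    TimeSpan.FromSeconds(300),
    DataCacheLocalCacheInvalidationPolicy.TimeoutBased);

DataCacheFactory dcf = new DataCacheFactory(cfg);
DataCache dc = dcf.GetDefaultCache();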

4. Simulation in Development Fabric

The development fabric (dev fabric) can be used for simulation before deploying to Azure. Start debugging your project in Visual Studio for a dev fabric run. In this case, the cache server process is emulated on the local machine.

To exit the dev fabric gracefully, use “Exit” from its menu options or from the system tray instead of killing any process.

5. Deploy Application to Azure

Now you are ready to deploy your application to Azure. Right-click on the cloud project; you will find options to either “Publish” the project to one of your slots or create a “Package” and upload the .cspkg and .cscfg from the Azure portal.

 
 
