Summary of Microsoft Connect 2015

I've tried to pull together most of the Microsoft technology announcements from Connect 2015. Some of the news was updated on 30 November with the RTM releases of .NET 4.6.1, .NET Core, and VS 2015 Update 1.

Brian Harry also posted some of them here – http://blogs.msdn.com/b/bharry/archive/2015/11/30/vs-2015-update-1-and-tfs-2015-update-1-are-available.aspx

.NET Team – http://blogs.msdn.com/b/dotnet/archive/2015/12/01/the-week-in-net-12-1-2015.aspx 

The highlights are below…

  • Visual Studio Code beta release: extensibility support added, and the project is now open source
  • .NET Core 5 RC and ASP.NET 5 RC with a Go-Live license, so you can start using them in production (December update: now RTM)
  • Visual Studio Online is now Visual Studio Team Services, for agile team collaboration and DevOps
  • Visual Studio Dev Essentials: priority forum support, Pluralsight, Wintellect, Xamarin (Azure benefit coming in early 2016)
  • Visual Studio cloud subscriptions
    • Monthly subscriptions include the VS Pro or Enterprise IDE and access to VSTS
    • Annual subscriptions also include technical support incidents, Azure credits, etc.
  • Visual Studio Marketplace, a central place to find, acquire, and install extensions for all editions of Visual Studio
  • VS 2015 Update 1 and TFS 2015 Update 1, both released on November 30th
  • Xamarin 4 support: an end-to-end solution to build, test, and monitor native mobile apps with VS 2015 Update 1
  • iOS builds with MacinCloud on VSTS, currently in preview at $30/month per agent with no limits on build hours

Other announcements

Read all the details, and watch more than 70 on-demand videos with additional detail, at connect2015.


More Microsoft open-source and multi-platform tools – .NET Core is Go-Live

In just a few hours at Connect 2015, Microsoft announced a lot of new things; the keynotes and videos are below.

Connect 2015 Keynotes and Videos

You can watch everyone’s talks here.

Enjoy, devs! 🙂

Code Quality improvements with MSBuild & Team Build – SonarQube integration

Technical debt is the set of problems in a development effort that make forward progress on customer value inefficient.  Technical debt saps productivity by making code hard to understand, fragile, difficult to validate, and creates unplanned work that blocks progress. Technical debt is insidious.  It starts small and grows over time through rushed changes, lack of context and lack of discipline. Organizations often find that more than 50% of their capacity is sapped by technical debt.


SonarQube is an open source platform that is the de facto solution for understanding and managing technical debt.

Customers have been telling us and SonarSource, the company behind SonarQube, that the SonarQube analysis of .NET apps and the integration with Microsoft build technologies need to be considerably improved.

Over the past few months we have been collaborating with our friends at SonarSource, and we are pleased to make available a set of integration components that allow you to configure a Team Foundation Server (TFS) build to connect to a SonarQube server and send it the following data, gathered during a build under the governance of the quality profiles and gates defined on the SonarQube server:

  • results of .NET and JavaScript code analysis
  • code clone analysis
  • code coverage data from tests
  • metrics for .NET and JavaScript

We have initially targeted TFS 2013 and above, so customers can try out these bits immediately with code and build definitions that they already have. We have tried using the above bits with builds in Visual Studio Online (VSO), using an on-premises build agent, but we have uncovered a bug around the discovery of code coverage data which we are working on resolving. When this is fixed we’ll send out an update on this blog. We are also working on integration with the next generation of build in VSO and TFS.

In addition, SonarSource have produced a set of .NET rules, written using the new Roslyn-based code analysis framework, and published them in two forms: a NuGet package and a VSIX. With this set of rules, the analysis that is done as part of the build can also be done live inside Visual Studio 2015, exploiting the new Visual Studio 2015 code analysis experience.

The source code for the above has been made available at https://github.com/SonarSource.

We are also grateful to our ever-supportive ALM Rangers who have, in parallel, written a SonarQube Installation Guide, which explains how to set up a production-ready SonarQube installation to be used in conjunction with Team Foundation Server 2013 to analyse .NET apps. This includes reference to the new integration components mentioned above.


This is only the start of our collaboration. We have lots of exciting ideas on our backlog, so watch this space.

As always, we’d appreciate your feedback on how you find the experience and ideas about how it could be improved to help you and your teams deliver higher quality and easier to maintain software more efficiently.

If you have any technical issues then please make your way over to http://stackoverflow.com, tagging your questions with sonarqube, and optionally tfs, c#, .net etc. For the current list of sonarqube questions see http://stackoverflow.com/questions/tagged/sonarqube.

Smart Unit Testing With Visual Studio 2015

Smart Unit Testing (formerly Pex & Moles) has a new name: IntelliTest. IntelliTest explores your .NET code to generate test data and a suite of unit tests. For every statement in the code, a test input is generated that will execute that statement. A case analysis is performed for every conditional branch in the code. For example, if statements, assertions, and all operations that can throw exceptions are analyzed. This analysis is used to generate test data for a parameterized unit test for each of your methods, creating unit tests with maximum code coverage. Then you bring your domain knowledge to improve these unit tests.
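
As a rough illustration of this path analysis, consider the small class below (a sketch; the Classifier type and its method are hypothetical, not from the product documentation). IntelliTest would generate one test input per reachable path:

using System;

public static class Classifier
{
    // IntelliTest generates one test input per reachable path:
    //   lengths == null     -> the ArgumentNullException path
    //   lengths.Length == 0 -> returns 0
    //   lengths[0] > 10     -> returns 1
    //   lengths[0] <= 10    -> returns -1
    public static int Classify(int[] lengths)
    {
        if (lengths == null)
            throw new ArgumentNullException(nameof(lengths));
        if (lengths.Length == 0)
            return 0;
        return lengths[0] > 10 ? 1 : -1;
    }
}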

When you run IntelliTest, you can easily see which tests are failing and add any necessary code to fix them. You can select which of the generated tests to save into a test project to provide a regression suite. As you change your code, rerun IntelliTest to keep the generated tests in sync with your code changes.

Get started with IntelliTest

You must use Visual Studio Enterprise. 🙁

Explore: Use IntelliTest to explore your code paths and generate test data
  1. Open your solution in Visual Studio. Then open the class file that has methods you want to test.
  2. Right-click in a method in your code and choose Run IntelliTest to generate unit tests for all the code paths in your method.

    Right-click in your method to generate unit tests

    A parameterized unit test is generated for this method. The test data is created to exercise the code paths in the method. IntelliTest runs your code many times with different inputs. Each run is represented in the table showing the input test data and the resulting output or exception.

    Exploration Results window is displayed with tests

    To generate unit tests for all the public methods in a class, simply right-click in the class rather than a specific method. Then choose Run IntelliTest. Use the drop-down list in the Exploration Results window to display the unit tests and the input data for each method in the class.

    Select the test results to view from the list

    For tests that pass, check that the reported results in the result column match your expectations for your code. For tests that fail, fix your code and add exception handling if necessary. Then rerun IntelliTest to see if your fixes generated more test data from different code paths.

Persist: Save test data and unit tests as a regression suite
  • Select the data rows that you want to save with the parameterized unit test into a test project.

    Select tests; right-click and choose Save

    You can view the test project and the parameterized unit test that has been created with a PexMethod attribute. (The individual unit tests, corresponding to each of the rows, are saved in the .g.cs file in the test project.) The unit tests are created using the Visual Studio test framework, so you can run them and view the results from Test Explorer just as you would for any unit tests that you created manually.

    Open class file in test method to view unit test

    Any necessary references are also added to the test project.

    If the method code changes, rerun IntelliTest to keep the unit tests in sync with the changes.
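
As a hedged sketch (the names are hypothetical, reusing the Classifier example from earlier), the saved parameterized test looks roughly like this, with the attributes coming from the Microsoft.Pex.Framework namespace that IntelliTest uses:

using Microsoft.Pex.Framework;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[PexClass(typeof(Classifier))]
[TestClass]
public partial class ClassifierTest
{
    // IntelliTest explores this parameterized method; each saved data row
    // becomes a concrete [TestMethod] in the generated .g.cs file.
    [PexMethod]
    public int ClassifyTest(int[] lengths)
    {
        return Classifier.Classify(lengths);
    }
}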

Assist: Use IntelliTest to find issues in your code
  1. If you have more complex code, IntelliTest can help you discover any issues for unit testing. For example, if you have a method that takes an interface as a parameter and there is more than one class that implements that interface, IntelliTest cannot tell which implementation to use. After you run IntelliTest, warnings are displayed for this issue (a sketch of this situation follows these steps). View the warnings to decide what you want to do.

    Right-click method and choose Smart Unit Tests

     

  2. After you investigate the code and understand what you want to test, you can fix a warning to choose which classes to use to test the interface.

    Right-click the warning and choose Fix

    This choice is added into the PexAssemblyInfo.cs file.

    [assembly: PexUseType(typeof(Camera))]

     

  3. Now you can rerun IntelliTest to generate a parameterized unit test and test data just using the class that you fixed.

    Rerun Smart Unit Tests to generate the test data
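
To make the interface warning in step 1 concrete, here is a hedged sketch; only the Camera type appears in the PexUseType line above, while the interface and the second implementation are hypothetical:

public interface ICamera
{
    byte[] Capture();
}

public class Camera : ICamera
{
    public byte[] Capture() { return new byte[640 * 480]; }
}

public class TestCamera : ICamera
{
    public byte[] Capture() { return new byte[0]; }
}

public static class FrameProcessor
{
    // IntelliTest must pick a concrete ICamera to exercise this method; with
    // two implementations it warns, and the PexUseType fix above selects Camera.
    public static int FrameSize(ICamera camera)
    {
        return camera.Capture().Length;
    }
}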

Q & A

Q: Can you use IntelliTest for unmanaged code?

A: No, IntelliTest only works with managed code, because it analyzes the code by instrumenting the MSIL instructions.

Q: When does a generated test pass or fail?

A: It passes like any other unit test if no exceptions occur. It fails if any assertion fails, or if the code under test throws an unhandled exception.

If you have a test that can pass if certain exceptions are thrown, you can set one of the following attributes based on your requirements at the test method, test class or assembly level:

  • PexAllowedExceptionAttribute
  • PexAllowedExceptionFromTypeAttribute
  • PexAllowedExceptionFromTypeUnderTestAttribute
  • PexAllowedExceptionFromAssemblyAttribute
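
For example, here is a hedged sketch that allows the ArgumentNullException path of the hypothetical ClassifyTest above to count as a pass:

[PexMethod]
[PexAllowedException(typeof(ArgumentNullException))] // a null input may throw without failing the test
public int ClassifyTest(int[] lengths)
{
    return Classifier.Classify(lengths);
}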

Q: Can I add assumptions to the parameterized unit test?

A: Yes, use assumptions to specify which test data is not required for the unit test for a specific method. Use the PexAssume class to add assumptions. For example, you can add an assumption that the lengths variable is not null, like this:

PexAssume.IsNotNull(lengths);

If you add an assumption and rerun IntelliTest, the test data that is no longer relevant will be removed.
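
In the parameterized test this looks roughly as follows (continuing the hypothetical ClassifyTest sketch):

[PexMethod]
public int ClassifyTest(int[] lengths)
{
    PexAssume.IsNotNull(lengths); // prune null inputs from the exploration
    return Classifier.Classify(lengths);
}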

Q: Can I add assertions to the parameterized unit test?

A: Yes, IntelliTest will check that what you are asserting in your statement is in fact correct when it runs the unit tests. Use the PexAssert class to add assertions. For example, you can add an assertion that two variables are equal:

PexAssert.AreEqual(a, b);

If you add an assertion and rerun IntelliTest, it will check that your assertion is valid and the test fails if it is not.
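
A hedged sketch, again with hypothetical names, asserting that the method under test is deterministic:

[PexMethod]
public void ClassifyIsDeterministicTest(int[] lengths)
{
    PexAssume.IsNotNull(lengths);
    // The assertion is checked for every input IntelliTest generates:
    PexAssert.AreEqual(Classifier.Classify(lengths), Classifier.Classify(lengths));
}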

Q: What testing frameworks does IntelliTest support?

A: Currently only MSTest is supported.

Q: Can I learn more about how the tests are generated?

A: Yes, read this blog post about the model and process.

Free Visual Studio 2013 Community Edition and difference with Express Editions

First of all, both are free!  Visual Studio Community Edition is newly released today.

There are two main differences between Visual Studio 2013 Community Edition and the Express Editions:

1) Visual Studio Express Editions target specific platforms: Express for Web allows you to develop web apps; Express for Windows allows you to develop Windows apps; Express for Windows Desktop allows you to develop desktop apps. With Visual Studio Community Edition, you can develop projects targeting multiple platforms.

2) Visual Studio Express Editions do not allow users to use extensions (a.k.a. plugins). There are over 5,000 great plugins for Visual Studio in the Visual Studio Gallery. Plugins such as Developer Assistant can boost developers’ productivity. Unfortunately, they are not available to Visual Studio Express users. With Visual Studio Community Edition, you can access and use them all!

Visual Studio Community Edition may retire the Visual Studio Express Editions in the future, but this has not been decided yet.

How to Diagnose .NET Memory Issues in Production using Visual Studio

One of the issues that frequently affects .NET applications running in production environments is problems with their memory use, which can impact both the application and potentially the entire machine. To help with this, we’ve introduced a feature in Visual Studio 2013 to help you understand the .NET memory use of your applications from .dmp files collected on production machines. In this post, I’ll first discuss common types of memory problems and why they matter. I’ll then walk through how to collect data, and finally describe how to use the new functionality to solve memory problems in your applications.

Why worry about memory problems

.NET is a garbage collected runtime, which means most of the time the framework’s garbage collector takes care of cleaning up memory and the user never notices any impact. However, when an application has a problem with its memory this can have a negative impact on both the application and the machine.

  1. Memory leaks are places in an application where objects are meant to be temporary, but instead are left permanently in memory. In a garbage collected runtime like .NET, developers do not need to explicitly free memory as they must in a language like C++. However, the garbage collector can only free memory that is no longer being used, which it determines based on whether the object is reachable (referenced) by other objects that are still active in memory. So a memory leak occurs in .NET when an object is still reachable from the roots of the application but should not be (e.g. a static event handler references an object that should be collected; see the sketch after this list). When memory leaks occur, usually memory increases slowly over time until the application starts to exhibit poor performance.  Physical resource leaks are a sub-category of memory leaks where a physical resource such as a file or OS handle is accidentally left open or retained. This can lead to errors later in execution as well as increased memory consumption.
  2. Inefficient memory use is when an application is using significantly more memory than intended at any given point in time, but the memory consumption is not the result of a leak. An example of inefficient memory use in a web application is querying a database and bringing back significantly more results than are needed by the application.
  3. Unnecessary allocations. In .NET, allocation is often quite fast, but its overall cost can be deceptive, because the garbage collector (GC) needs to clean it up later. The more memory that gets allocated, the more frequently the GC will need to run. These GC costs are often negligible to the program’s performance, but for certain kinds of apps, these costs can add up quickly and make a noticeable impact on the performance of the app.
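
As a minimal sketch of the static event handler leak mentioned in item 1 (all type names here are hypothetical):

using System;

public static class AppEvents
{
    // A static event is rooted for the lifetime of the process.
    public static event EventHandler SettingsChanged;
}

public class ReportView
{
    public ReportView()
    {
        // Every ReportView that subscribes stays reachable through the
        // static event's invocation list, so it is never collected...
        AppEvents.SettingsChanged += OnSettingsChanged;
    }

    private void OnSettingsChanged(object sender, EventArgs e) { /* refresh the view */ }

    // ...unless it unsubscribes when it is done:
    public void Close()
    {
        AppEvents.SettingsChanged -= OnSettingsChanged;
    }
}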

If an application suffers from a memory problem, there are three common symptoms that may affect end users.

  1. The application can crash with an “Out of Memory” exception. This is a relatively common problem for 32-bit applications because they are limited to only 4 GB of total virtual address space. It is however less common for 64-bit applications because they are given much higher virtual address space limits by the operating system.
  2. The application will begin to exhibit poor performance. This can occur because the garbage collector is running frequently and competing for CPU resources with the application, or because the application constantly needs to move memory between RAM (physical memory) and the hard drive (virtual memory), which is called paging.
  3. Other applications running on the same machine exhibit poor performance. Because the CPU and physical memory are both system resources, if an application is consuming a large amount of these resources, other applications are left with insufficient amounts and will exhibit negative performance.

In this post I’ll be covering a new feature added to Visual Studio 2013 intended to help identify memory leaks and inefficient memory use (the first two problem types discussed above). If you are interested in tools to help identify problems related to unnecessary allocations, see .NET Memory Allocation Profiling with Visual Studio 2012.

Collecting the data

To understand how the new .NET memory feature for .dmp files helps us to find and fix memory problems let’s walk through an example. For this purpose, I have introduced a memory leak when loading the Home page of a default MVC application created with Visual Studio 2013 (click here to download). However to simulate how a normal memory leak investigation works, we’ll use the tool to identify the problem before we discuss the problematic source code.

The first thing I am going to do is launch the application without debugging to start it in IIS Express. Next I am going to open Windows Performance Monitor to track the memory usage during my testing of the application. Then I’ll add the “.NET CLR Memory -> # Bytes in all Heaps” counter, which will show me how much memory I’m using in the .NET runtime (~3.5 MB at this point). You may use alternate or additional tools in your environment to detect when memory problems occur; I’m simply using Performance Monitor as an example. The important point is that a memory problem is detected that you need to investigate further.
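
If you prefer to watch the same counter from code rather than in Performance Monitor, here is a minimal sketch (assuming "iisexpress" is the counter instance name for the process; instance names can differ on your machine):

using System;
using System.Diagnostics;

class HeapWatcher
{
    static void Main()
    {
        // The same counter shown in Performance Monitor:
        // .NET CLR Memory -> # Bytes in all Heaps
        using (var counter = new PerformanceCounter(
            ".NET CLR Memory", "# Bytes in all Heaps", "iisexpress"))
        {
            Console.WriteLine("{0:F1} MB in all heaps",
                counter.NextValue() / (1024.0 * 1024.0));
        }
    }
}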


The next thing I’m going to do is refresh the home page five times to exercise the page load logic. After doing this I can see that my memory has increased from ~3.5 MB to ~13 MB so this seems to indicate that I may have a problem with my application’s memory since I would not expect multiple page loads by the same user to result in a significant increase in memory.


For this example I’m going to capture a dump of iisexpress.exe using ProcDump, and name it “iisexpress1.dmp” (notice I need to use the -ma flag to capture the process memory, otherwise I won’t be able to analyze the memory). You can read about alternate tools for capturing dumps in what is a dump and how do I create one?
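
Based on the tool and flag named above, the capture command looks roughly like this (a sketch; it assumes a single iisexpress process and procdump.exe on the path):

procdump -ma iisexpress iisexpress1.dmp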


Now that I’ve collected a baseline snapshot, I’m going to refresh the page an additional 10 times. After the additional refreshes I can see that my memory use has increased to ~21 MB. So I am going to use procdump.exe again to capture a second dump I’ll call “iisexpress2.dmp”.


Now that we’ve collected the dump files, we’re ready to use Visual Studio to identify the problem.

Analyzing the dump files

The first thing we need to do to begin analysis is open a dump file. In this case I’m going to choose the most recent dump file, “iisexpress2.dmp”.

Once the file is open, I’m presented with the dump file summary page in Visual Studio, which gives me information such as when the dump was created, the architecture of the process, the version of Windows, and the version of the .NET runtime (CLR version) the process was running. To begin analyzing the managed memory, click “Debug Managed Memory” in the “Actions” box in the top right.


This will begin the analysis.


Once analysis completes I am presented with Visual Studio 2013’s brand new managed memory analysis view. The window contains two panes, the top pane contains a list of the objects in the heap grouped by their type name with columns that show me their count and the total size. When a type or instance is selected in the top pane, the bottom one shows the objects that are referencing this type or instance which prevent it from being garbage collected.


[Note: At this point Visual Studio is in debug mode since we are actually debugging the dump file, so I have closed the default debug windows (watch, call stack, etc.) in the screenshot above.]

Thinking back to the test scenario I was running there are two issues I want to investigate. First, 16 page loads increased my memory by ~18 MB which appears to be an inefficient use of memory since each page load should not use over 1 MB. Second, as a single user I’m requesting the same page multiple times, which I expect to have a minimal effect on the process memory, however the memory is increasing with every page load.

Improving the memory efficiency

First, I want to see if I can make page loading more memory efficient, so I’ll start by looking at the objects that are using the most memory in the type summary (top pane) of the memory analysis window.

Here I see that Byte[] is the type that is using the most memory, so I’ll expand the System.Byte[] line to see the 10 largest Byte[]’s in memory. I see that this and all of the largest Byte[]’s are ~1 MB each, which seems large, so I want to determine what is using these large Byte[]’s. Clicking on the first instance shows me it is being referenced by a SampleLeak.Models.User object (as are all of the largest Byte[]’s if I work my way down the list).


At this point I need to go to my application’s source code to see what User is using the Byte[] for. Navigating to the definition of User in the sample project I can see that I have a BinaryData member that is of type byte[]. It turns out when I’m retrieving my user from the database I’m populating this field even though I am not using this data as part of the page load logic.

public class User : IUser
{
    [Key]
    public string Id { get; set; }

    public string UserName { get; set; }

    public byte[] BinaryData { get; set; }
}

The User object is populated by this query:

User user = MockDatabase.SelectOrCreateUser(
    "select * from Users where Id = @p1",
    userID);

In order to fix this, I need to modify my query to retrieve only the Id and UserName when I’m loading a page; I’ll retrieve the binary data later, only if and when I need it.

User user = MockDatabase.SelectOrCreateUser(
    "select Id, UserName from Users where Id = @p1",
    userID);

Finding the memory leak

The second problem I want to investigate is the continual growth of the memory that is indicating a leak. The ability to see what has changed over time is a very powerful way to find leaks, so I am going to compare the current dump to the first one I took. To do this, I expand the “Select Baseline” dropdown, and choose “Browse…” This allows me to select “iisexpress1.dmp” as my baseline.


Once the baseline finishes analyzing, I have two additional columns, “Count Diff” and “Total Size Diff”, that show me the change between the baseline and the current dump. Since I see a lot of system objects I don’t control in the list, I’ll use the Search box to find all objects in my application’s top-level namespace, “SampleLeak”. After I search, I see that SampleLeak.Models.User has increased the most in both size and count (there are an additional 10 objects compared to the baseline). This is a good indication that User may be leaking.


The next thing to do is determine why User objects are not being collected. To do this, I select the SampleLeak.Models.User row in the top table. This will then show me the reference graph for all User objects in the bottom pane. Here I can see that SampleLeak.Models.User[] has added an additional 10 references to User objects (notice the reference count diff matches the count diff of User).


Since I don’t remember explicitly creating a User[] in my code, I’ll expand the reference graph back to the root to figure out what is referencing the User[]. Once I’ve finished expanding, I can see that the User[] is part of a List<User> which is directly referenced by a root (the type of root is displayed in []’s next to the root type; in this case System.Object[] is a pinned handle).


What I need to do next is determine where in my application I have a List<> that User objects are being added to. To do this, I search for List<User> in my sample application using Edit -> Find -> Quick Find (Ctrl + F). This takes me to the UserRepository class I added to the application.

public static class UserRepository
{
    // Store a local copy of recent users in memory to prevent extra database queries
    static private List<User> m_userCache = new List<User>();
    public static List<User> UserCache { get { return m_userCache; } }

    public static User GetUser(string userID)
    {
        // Retrieve the user's database record
        User user = MockDatabase.SelectOrCreateUser(
            "select Id, UserName from Users where Id = @p1",
            userID);

        // Add the user to cache before returning
        m_userCache.Add(user);
        return user;
    }
}

Note, at this point determining the right fix usually requires an understanding of how the application works. In the case of my sample application, when a user loads the Home page, the page’s controller queries the UserRepository for the user’s database record. If the user does not have an existing record, a new one is created and returned to the controller. In my UserRepository I have created a static List<User> I’m using as a cache to keep local copies so I don’t always need to query the database. However, statics are automatically rooted, which is why the List<User> shows as directly referenced by a root rather than by UserRepository.

Coming back to the investigation, a review of the logic in my GetUser() method reveals the problem: I’m not checking the cache before querying the database, so on every page load I’m creating a new User object and adding it to the cache. To fix this problem I need to check the cache before querying the database.

public static User GetUser(string userID)
{
    // Check to see if the user is in the local cache
    var cachedUser = from user in m_userCache
                     where user.Id == userID
                     select user;

    if (cachedUser.Count() > 0)
    {
        return cachedUser.FirstOrDefault();
    }
    else
    {
        // User is not in the local cache; retrieve the user from the database
        // (still selecting only the columns the page needs)
        User user = MockDatabase.SelectOrCreateUser(
            "select Id, UserName from Users where Id = @p1",
            userID);

        // Add the user to cache before returning
        m_userCache.Add(user);

        return user;
    }
}

Validating the fix

Once I make these changes I want to verify that I have correctly fixed the problem. In order to do this, I’ll launch the modified application again and after 20 page refreshes, Performance Monitor shows me only a minimal increase in memory (some variation is to be expected as garbage builds up until it is collected).

Just to definitively validate the fixes, I’ll capture one more dump; a look at it shows me that Byte[] is no longer the object type taking up the most memory. When I do expand Byte[], I can see that the largest instance is much smaller than the previous 1 MB instances, and it is not being referenced by User. Searching for User shows me one instance in memory rather than 20, so I am confident I have fixed both of these issues.
