Overview: Microsoft Technical Summit 2016 and .NET Developer Conference

I recently gave a talk at the .NET Developer Conference (http://www.dotnet-developer-conference.de/) about microservices and scalable backends with Azure Service Fabric, including a transition path from a legacy application to a scalable one.

Slides: https://1drv.ms/b/s!Akicn3-kM3q2hcpDxdbumEC7gd9usw

Microsoft’s Technical Summit is an event aimed mainly at developers, featuring announcements around Microsoft’s developer stack.

Once again, a sensation on the topic of Microsoft and Linux: Microsoft has joined the Linux Foundation and will hold a board seat there! At the same time, Google has become a member of the .NET Foundation!

Microsoft has unveiled Visual Studio for Mac. The new IDE is not a port of the existing Visual Studio, which runs only on Windows, but an extended version of Xamarin Studio, which Microsoft obtained through its acquisition of Xamarin at the beginning of 2016 and which in turn is based on the free MonoDevelop. As a first step, in keeping with Microsoft’s “mobile first, cloud first” strategy, the new IDE supports only the development of Xamarin apps for iOS and Android, as well as web applications and REST-based web services with .NET Core. As the programming language, developers can choose between C# and F#.

Visual Studio “15” is now called Visual Studio 2017 and is available as a release candidate.

Visual Studio 2017 and Visual Studio for Mac now offer a graphical preview for Xamarin.Forms. So far, developers could write XAML tags in the editor but only saw the result at run time. Graphical design with the mouse, which developers of Windows Presentation Foundation (WPF), Silverlight, or Windows universal apps are used to, is still not possible for a Xamarin.Forms interface.

.NET Core, ASP.NET Core, and Entity Framework Core have been released in version 1.1.

ASP.NET Core 1.1 is said to be 15 percent faster and retrofits numerous functions (URL rewriting, response caching, HTTP compression, a new process host listener, use of view components via tag helpers, new locations for sensitive data).

In Entity Framework Core 1.1, Microsoft brings back some programming functions from the predecessor Entity Framework 6.1 that were missing in Entity Framework Core 1.0: searching for objects in the cache with Find(), explicitly loading related objects with the Load() method, refreshing previously loaded objects with Reload(), and simply querying object content, old and new, with GetDatabaseValues() and GetModifiedProperties(). Mapping to simple fields instead of properties and the recovery of lost database connections (connection resiliency) to Microsoft SQL Server or SQL Azure are also possible again. Brand new is the ability to map objects to the memory-optimized tables of Microsoft SQL Server 2016. In addition, the makers have simplified the API with which the standard functions of Entity Framework Core can be replaced by custom implementations.
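To make this concrete, here is a minimal sketch of some of the restored APIs; BlogContext, Blog, and the Posts collection are hypothetical model types, not from the release notes:

using Microsoft.EntityFrameworkCore;

class EfCore11Examples
{
    static void Demo()
    {
        using (var context = new BlogContext()) // hypothetical DbContext with a Blogs DbSet
        {
            // Find() checks the context cache first and only queries the database if needed
            var blog = context.Blogs.Find(1);

            // explicitly load related objects that were not part of the original query
            context.Entry(blog).Collection(b => b.Posts).Load();

            // read the current database values without overwriting local changes
            var databaseValues = context.Entry(blog).GetDatabaseValues();

            // discard local changes and refresh the entity from the database
            context.Entry(blog).Reload();
        }
    }
}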

.NET Core 1.1 adds Linux Mint 18, OpenSUSE 42.1, macOS 10.12, and Windows Server 2016 as supported operating systems. Samsung is also bringing .NET Core, together with the corresponding Visual Studio tools, to its Tizen operating system; Samsung has been a member of the .NET Foundation since June 2016. This represents a further step toward the cross-platform availability of Microsoft products.

Team Foundation Server “15” receives the version number 2017, just like Visual Studio. It had previously reached the release candidate 2 stage and is now available as an RTM version.

The new Visual Studio Mobile Center is a cloud application that automatically builds source code hosted on GitHub for iOS and Android apps on each commit, tests it on real hardware, and distributes successfully tested app packages to beta testers. Usage analytics and run-time failure reporting are also available. The supported programming languages are Swift, Objective-C, Java, and C# (Xamarin). Support for Cordova and Windows universal apps is planned.

SQL Server for Linux: In March, Microsoft announced that it would provide SQL Server for Linux as well. A preview version of the database server is now available for Red Hat Enterprise Linux, Ubuntu Linux, macOS, and Windows, as well as for Docker. A version for SUSE Linux Enterprise Server is to follow soon. The latest release is called SQL Server Community Technology Preview vNext and represents an evolution of SQL Server 2016, with new features beyond the platform independence that were shown in the keynote but not previously listed on the website.

For SQL Server 2016, Service Pack 1 is now available; in addition to bug fixes, it contains significant improvements to the licensing model.

Xamarin for Free with Visual Studio & Mono Open Source

Visual Studio now includes Xamarin at no extra cost, as announced yesterday at the #Build2016 conference, and that is a huge benefit for developers.

 

The Xamarin platform will be in every edition of Visual Studio, including the widely available Visual Studio Community Edition, which is free for individual developers, open source projects, academic research, education, and small teams. Now we can develop and publish native apps for iOS and Android with C# or F# directly within Visual Studio, with no limits on app size.

For Mac developers, Xamarin Studio is now available as a benefit of Visual Studio Professional or Enterprise subscriptions. Developers can use the newly created Xamarin Studio Community Edition for free.

To give it a try and begin developing iOS and Android apps with the full version of Xamarin and C#, download Xamarin Studio or Xamarin for Visual Studio.

Another big announcement is that the Mono Project is being added to the .NET Foundation, including some previously proprietary mobile-specific improvements to the Mono runtime. Mono will also be re-released under the MIT License, enabling an even broader set of uses for everyone. More details are on the Mono Project blog.

The changes to Mono remove all barriers to adopting a modern, performant .NET runtime in any software product, embedded device, or engine, and open the door to easily integrating C# with apps and games on iOS, Android, Mac, Windows, and any emerging platforms developers want to target in the future.

How to Use TFS with Scaled Agile Framework

Team Foundation Server and Visual Studio can be used to increase productivity and transparency into your application, as well as to increase the rate at which you can ship high-quality software throughout the application lifecycle.

The following whitepaper describes how to use TFS to support epics, release trains, and multiple backlogs.

The Scaled Agile Framework, or SAFe, is popular among organizations looking to scale Agile practices to the enterprise level. SAFe is a comprehensive framework, covering practices from portfolio-level planning to release planning to coding practices.

While TFS does not provide full support for all SAFe practices, it can be used to implement many of the planning practices. The whitepaper also provides practical guidance on how to implement SAFe practices using TFS. It covers the following topics:

[Figure: overview of the whitepaper’s four sections]

The first two sections are conceptual and provide a quick overview of how TFS supports SAFe. The last two sections are guidance and provide detailed steps for the TFS administrator to configure and customize TFS to support SAFe.

Mapping SAFe concepts to TFS concepts

SAFe supports a portfolio view of multiple agile teams. It illustrates how a portfolio vision is met by a hierarchy of teams, all of whom have their own specific objectives. The framework breaks down Epics into Features and Stories, which teams work on in Sprints and deliver through Program Increments (PIs) and Release Trains. The portfolio backlog can also track how deliverables map to Strategic Themes and associated budgets.

SAFe architectural overview © D. Leffingwell

Image courtesy of Leffingwell, LLC.

The examples in this paper illustrate how to add the Epic WIT and backlog, configure a three-level team hierarchy, and map teams to their respective area and iteration paths. The examples build on the TFS Agile process template; however, the changes can be applied to any TFS process template.
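As a concrete illustration (my example, not taken from the whitepaper): once you have an Epic work item type definition as XML, it can be imported with the witadmin tool; the collection URL, project name, and file name below are placeholders.

E.g. witadmin importwitd /collection:http://myserver:8080/tfs/DefaultCollection /p:MyProject /f:Epic.xml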

TFS structure to support SAFe

SAFe Portfolios, Programs, and Teams map to TFS team projects and teams

Because TFS supports a hierarchical team structure, each team has its own view of its work, which rolls up to the next level within the team hierarchy.

SAFe roles to TFS teams


The section “Customize TFS process to support SAFe” details the changes to the Scrum, Agile, and CMMI process templates that enable SAFe support. The goal is not to create a SAFe process template, but to modify the existing process templates to enable SAFe practices. These changes are minimal and don’t encumber teams who choose not to use SAFe.

Now, you have the following options to update the templates to include these changes:

  1. You can download the standard Scrum, Agile, and CMMI process templates with the changes for SAFe here.
  2. If you have customized process templates, you can follow the instructions in the guidance. Additionally, this blog post shows how to automate the process with PowerShell.

This whitepaper assumes familiarity with the Scaled Agile Framework. If you’re familiar with Scrum but not with SAFe, Inbar Oren has published these great videos, which explain the basic SAFe concepts quickly.

Have fun with the process!

Migration from File Shares to Document Management (SharePoint 2013) using OneDrive (SkyDrive) for Business

Here are some tips on how to migrate file shares to SharePoint and use OneDrive (SkyDrive) for Business (ODFB), for when you are planning to migrate file share content into SharePoint and want to use ODFB for synchronizing the SharePoint content offline.

Note: these steps are valid for both SharePoint 2013 on-premises and SharePoint Online (SPO).

Info about SharePoint Limits

http://technet.microsoft.com/en-us/library/cc262787(v=office.15).aspx#Limits

First Step – Analyze your File Shares

As a first step, try to understand the data that resides on the file shares. Ask yourself the following questions:

  • What is the total size of the file share data that the customer wants to migrate?
  • How many files are there in total?
  • What are the largest file sizes?
  • How deep are the folder structures nested?
  • Is there any content that is not being used anymore?
  • What file types are there?

Let me try to explain why you should ask yourself these questions.

Total Size

If the total size of the file shares is more than the storage capacity you have in SharePoint, you need to buy additional storage (SPO) or increase your disk capacity (on-premises). To determine how much storage you will have in SPO, please check the total available tenant storage in the tables in this article. Another issue that may arise is that you reach the capacity limit per site collection. For SPO that is 1,000 gigabytes (changed from 100 GB to 1 TB); for on-premises deployments, the recommended size per site collection is still around 200 gigabytes.

What if we have more than 1,000 gigabytes?

  • Try to divide the file share content over multiple site collections for content that needs to be shared with others.
  • If certain content is just for personal use, try to migrate that specific content into the personal site of the user.

How Many Files

The total number of files on the file shares is important, as there are limits in both SharePoint and ODFB that can result in an unusable library or list within SharePoint; you might also end up with missing files when using the ODFB client.

First, in SPO we have a fixed limit of 5,000 items per view, folder, or query. The reasoning behind this 5,000-item limit boils all the way down to how SQL Server works under the hood; if you would like to know more about it, please read this article. On-premises there is a way to raise this limit, but it is not something we recommend, as performance can decrease significantly when you increase it.

Secondly, for ODFB there is also a limit of 5,000 items for synchronizing team sites and 20,000 items for synchronizing personal sites. This means that if you have a document library that contains more than 5,000 items, the rest of the items will not be synchronized locally.

There is also a limit of 5 million items within a document library, but I guess most customers in SMB won’t reach that limit very easily.

What should I do if my data that I want to migrate to a document library contains more than 5000 items in one folder?

  • Try to divide that amount over multiple subfolders or create additional views that limit the number of documents displayed.

But wait! If I already have 5,000 items in one folder, doesn’t that mean that the rest of the documents won’t get synchronized when I use ODFB?

Yes, that is correct. So if you would like to use ODFB to synchronize documents offline, make sure that the total number of documents per library in a team site does not exceed 5,000.

How do I work around that limit?

  • Look at the folder structure of the file share content and see if you can divide that data across multiple sites and/or libraries. If there is a Marketing folder, for example, it might make more sense to migrate that data into a separate site anyway, as this department probably wants to store additional information besides just documents (e.g. a calendar, general info about the marketing team, a site mailbox, etc.). An additional benefit of spreading the data over multiple sites/libraries is that it gives ODFB users more granularity over what data they can take offline. If you migrated everything into one big document library (not recommended), all users would need to synchronize everything, which can have a severe impact on your network bandwidth.

Largest File Sizes

Another limit that exists in both SPO and on-premises is the maximum file size: in both cases, the maximum size per file is 2 gigabytes. On-premises the default is 250 MB, but it can be increased to a maximum of 2 gigabytes.
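On-premises, the limit is raised per web application in Central Administration or via the server object model. A minimal sketch, assuming the code runs on the SharePoint server under a farm administrator account; http://intranet is a placeholder web application URL:

using System;
using Microsoft.SharePoint.Administration;

class RaiseUploadLimit
{
    static void Main()
    {
        // look up the web application and raise its upload limit (value is in MB)
        SPWebApplication webApp = SPWebApplication.Lookup(new Uri("http://intranet"));
        webApp.MaximumFileSize = 2047; // roughly 2 GB, the hard maximum
        webApp.Update();
    }
}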

So, what if I have files that exceed this size?

  • Well, they won’t fit in SharePoint, so you can’t migrate them. See what type of files they are and determine what they are used for in the organization. Examples could be software distribution images, large media files, training courses, or other materials. If these are still being used and are not highly confidential, it is not a bad thing to keep them on alternative storage like a SAN, a NAS, or DVDs. If it concerns data that just needs to be kept for legal reasons and doesn’t need to be retrieved instantly, you might put it on DVD or an external hard drive and store it in a safe, for example.

Folder Structures

Another important aspect to look at on your file shares is the depth of nested folders and the length of file names. The recommended total length of a URL in SharePoint is around 260 characters. You would think that 260 characters is pretty lengthy, but remember that URLs in SharePoint often have encoding applied to them, which takes up additional space. E.g., a space is one character, but URL-encoded it becomes %20, which takes up three characters. The problem is that you can run into issues when the URL becomes too long. More details about the exact limits can be found here, but as a best practice, try to keep the URL length of a document under 260 characters.
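A quick way to see the effect of encoding is to measure the escaped form of a path; a small sketch (the sample URL and file name are made up):

using System;

class UrlLengthCheck
{
    static void Main()
    {
        string path = "https://contoso.sharepoint.com/sites/HR/Shared Documents/proposal modified by Andre.docx";

        // escape the URL the way a browser would; each space becomes %20 (three characters)
        string encoded = Uri.EscapeUriString(path);

        Console.WriteLine(encoded);
        Console.WriteLine("Raw length: " + path.Length + ", encoded length: " + encoded.Length);
    }
}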

What if I have files that will have more than 260 characters in total URL length?

  • Make sure you keep your site URLs short (the site title can be long, though). E.g., don’t make the URL Human Resources; call it HR. When you land on the site, you will still see the full name Human Resources, as the site title and the URL are separate things in SharePoint.
  • Shorten document names (e.g. strip off “…v.1.2” or “…modified by Andre”), as SharePoint has versioning built in. More information about versioning can be found here.

Idle Content

Migrating file shares into SharePoint is often also a good moment to clean up some of the information that the organization has been collecting over the years. If you find a lot of content that has not been accessed for a couple of years, what would be the point of migrating that data to SharePoint?

So, what should I do when I come across such content?

  • Discuss this with the customer and determine if it is really necessary to keep this data.
  • If the data cannot be purged, you might consider storing it on a DVD or external hard drive and keep it in a safe.
  • If the content exists in multiple versions, such as proposal 1.0.docx, proposal 1.1.docx, proposal final.docx, and proposal modified by Andre.docx, you might consider moving just the latest version instead of migrating them all. This manual process might be time consuming, but it can save you lots of storage space in SharePoint. Versioning is also built into SharePoint and is optimized for storing multiple versions of the same document; for example, SharePoint stores only the delta of the next version, saving storage space that way. This functionality is called Shredded Storage.

Types of Files

Determine what kind of files the customer has. Are they mainly Office documents? If so, SharePoint is the best place to store such content. However, if you come across developer code, for example, it is not a good idea to move that into SharePoint. There are also file extensions that are not allowed in SPO and/or on-premises. A complete list of blocked file types for both SPO and on-premises can be found here.

What if I come across such blocked file extensions?

  • Well, you can’t move them into SharePoint, so you should ask yourself: do I still need these files? And if so, is there an alternative storage facility, such as a NAS, I can store them on? If it concerns developer code, you might want to store that code on a Team Foundation Server instead.

Tools for analyzing and fixing file share data

In order to determine whether you have large files or exceed the 5,000-item limit, for example, you need some kind of tooling. There are a couple of approaches here (a small do-it-yourself sketch follows the list):

  • There is a PowerShell script, enhanced by Hans Brender, which checks for blocked file types, bad characters in files and folders, and the maximum URL length. The script even allows you to fix invalid characters and file extensions. It is a great script, but it requires some knowledge of PowerShell. Another alternative I was pointed at is a tool called SharePrep, which scans for URL length and invalid characters.
  • There are other 3rd-party tools that can scan your file share content, such as TreeSize. Such tools do not necessarily check for the SharePoint limitations we talked about in the earlier paragraphs, but at least they will give you much more insight into the size of the file share content.
  • Finally, there are actual 3rd-party migration tools that move the file share content into SharePoint and check for invalid characters, extensions, and URL length upfront. We will dig into these tools in the second step, Migrating your data.
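If you just need a rough inventory before picking a tool, a few lines of C# can already flag the most common problems. A minimal sketch; the share path, the blocked-extension sample, and the 260-character threshold are assumptions for illustration:

using System;
using System.IO;
using System.Linq;

class FileShareScan
{
    static void Main()
    {
        string share = @"\\fileserver\share";         // share to scan (placeholder)
        string[] blocked = { ".exe", ".vbs", ".js" }; // small sample of blocked types
        const int maxUrlLength = 260;                 // recommended URL length limit

        foreach (string file in Directory.EnumerateFiles(share, "*", SearchOption.AllDirectories))
        {
            var info = new FileInfo(file);

            if (info.Length > 2L * 1024 * 1024 * 1024)
                Console.WriteLine("Larger than 2 GB: " + file);

            // note: very long paths can throw on older .NET Framework versions
            if (file.Length > maxUrlLength)
                Console.WriteLine("Path too long (" + file.Length + " chars): " + file);

            if (blocked.Contains(info.Extension.ToLowerInvariant()))
                Console.WriteLine("Blocked file type: " + file);
        }
    }
}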

Second Step – Migrating your data

So, now that we have analyzed our file share content, it is time to move it into SharePoint. There are a couple of approaches here.

Document Library Open with Explorer

If you are in a document library, you can open the library in Windows Explorer. That way you can simply copy and paste the files into SharePoint.


There are some drawbacks to this scenario. First of all, I’ve seen lots of issues just trying to open the library in Windows Explorer. Secondly, the technology used for copying the data into SharePoint is not very reliable, so keep that in mind when copying larger chunks of data. Finally, there is also drag & drop, but it is limited to files (no folders) and a maximum of 100 files per drag, so if you have 1,000 files, you would need to drag them in 10 chunks. More information can be found in this article. Checking for invalid characters, extensions, and URL length upfront is also not addressed by the Open with Explorer method.

Pros: Free, easy to use, works fine for smaller amounts of data

Cons: Not always reliable, no metadata preservation, no upfront detection of things like invalid characters, file type restrictions, path lengths, etc.

OneDrive (formerly SkyDrive) for Business

You could also use ODFB to upload the data into a library. This is fine as long as you don’t sync more than 5,000 items per library. Remember, though, that ODFB is not a migration tool but a sync tool, so it is not optimized for copying large chunks of data into SharePoint. Handling character and file type restrictions, path lengths, etc. is on the ODFB team’s list to address, but it is not there yet.

The main drawback of using either the Open with Explorer option or ODFB is that these tools don’t preserve the metadata of the files and folders on the file shares. By this I mean that things like the modified date or the owner field are not migrated into SharePoint: the owner becomes the user who copies the data, and the modified date becomes the timestamp of when the copy operation was executed. So if this metadata on the file shares is important, don’t use either of the methods mentioned earlier; use one of the third-party tools below.

Pros: Free, easy to use, works fine for smaller amounts of data (max 5000 per team site library or 20000 per personal site)

Cons: No metadata preservation, no upfront detection of things like invalid characters, file type restrictions, path lengths, etc.

3rd party tools

Here are some of the 3rd party tools that will provide additional detection, fixing and migration capabilities that we mentioned earlier:

Some have a focus on SMB, while others are more focused on the enterprise segment. We can’t express a preference for one tool or another, but most of the tools have a free trial version available, so you can try them out yourself.

Summary

When should I use what approach?

Here is a short summary of capabilities:

Capability | Open in Explorer | OneDrive for Business (with latest update) | 3rd party
Amount of data | Relatively small | No more than 5,000 items per library | Larger data sets
Invalid character detection | No | No | Mostly yes¹
URL length detection | No | No | Mostly yes¹
Metadata preservation | No | No | Mostly yes¹
Blocked file types detection | No | No | Mostly yes¹

¹ This depends on the capabilities of the 3rd-party tool.

Troubleshooting

ODFB gives me issues when synchronizing data
Please check whether you have the latest version of ODFB installed. There have been stability issues in earlier builds of the tool, but most of the issues should be fixed by now. You can check whether you are running the latest version by opening Word -> File -> Account and clicking Update Options -> View Updates. If your current version number is lower than the one shown, click the Disable Updates button (click Yes if prompted), then click Enable Updates (click Yes if prompted). This forces a download of the latest version of Office and thus the latest version of the ODFB tool.


If you are running the stand-alone version of ODFB, make sure you have downloaded the latest version from here.

Why is the upload taking so long?
This really depends on a lot of things. It can depend on:

  • The method or tool that is used to upload the data.
  • The available bandwidth for uploading the data. Tips:
  • Check your upload speed at http://www.speedtest.net and run a test against your nearest Office 365 data center. This will give you an indication of the maximum upload speed.
  • Companies often have less upload bandwidth available than people at home. If you have the chance, uploading from a home location might be faster.
  • Schedule the upload at times when there is more bandwidth available (usually at night).
  • Test your upload speed upfront by uploading, say, 1% of the data. Multiply the time by 100 and you have a rough estimate of the total upload time.
  • The computers used for uploading the data. A slow laptop can become a bottleneck while uploading the data.
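For example (numbers made up for illustration): if a 1% sample takes 15 minutes to upload, the full data set will take roughly 100 × 15 = 1,500 minutes, or about 25 hours, so spreading the upload over several nights is usually more realistic than one continuous run.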

Visual Studio 2012 Update 2 Highlights with Chris Binder Live Stream

Visual Studio 2012 only shipped its final release in September of last year. Following Update 1, Visual Studio 2012 Update 2 (VS 2012.2) is already available as a CTP. Tonight at 5 pm, Dariusz Parys and Christian Binder will present their personal highlights of the second update in a live stream. There will also be the opportunity to ask questions and discuss in the live chat.

More info on Dariusz Parys’ blog. The live stream starts at 5 pm here.

Keyboard Shortcuts for Windows 8 and Windows RT

 

In Windows 8 and Windows RT, new keyboard shortcuts are available alongside the familiar ones. The easiest way to search from the Start screen, for example, is simply to start typing your search term. Not on the Start screen? Just press the Windows logo key to switch quickly between the Start screen and the app you currently have open. If you are not familiar with keyboard shortcuts, or would like to see them all, you can look up the complete list of keyboard shortcuts.

Below are some of the most important keyboard shortcuts for Windows.

 

 

Windows logo key + start typing: Search your PC

CTRL + plus (+) or CTRL + minus (-): Zoom in or out on a large number of items, such as apps pinned to the Start screen

CTRL + scroll wheel: Zoom in or out on a large number of items, such as apps pinned to the Start screen

Windows logo key + C: Open the charms

Windows logo key + F: Open the Search charm to search files

Windows logo key + H: Open the Share charm

Windows logo key + I: Open the Settings charm

Windows logo key + J: Switch between the main app and a snapped app

Windows logo key + K: Open the Devices charm

Windows logo key + O: Lock the screen orientation (portrait or landscape)

Windows logo key + Q: Open the Search charm to search apps

Windows logo key + W: Open the Search charm to search settings

Windows logo key + Z: Show the commands available in the app

Windows logo key + spacebar: Switch input language and keyboard layout

Windows logo key + CTRL + spacebar: Change to a previously selected input

Windows logo key + Tab: Cycle through open apps (except desktop apps)

Windows logo key + CTRL + Tab: Cycle through open apps (except desktop apps) and snap them as they are cycled

Windows logo key + Shift + Tab: Cycle through open apps (except desktop apps) in reverse order

Windows logo key + Page Up: Move the Start screen and apps to the monitor on the left (apps on the desktop won’t change monitors)

Windows logo key + Page Down: Move the Start screen and apps to the monitor on the right (apps on the desktop won’t change monitors)

Windows logo key + Shift + period (.): Snap an app to the left

Windows logo key + period (.): Snap an app to the right

Windows 8 RTM

 

Windows 8 has reached Release to Manufacturing (RTM) and is now being issued to all PC OEM and manufacturing partners.

More details http://blogs.msdn.com/b/b8/archive/2012/08/01/releasing-windows-8-august-1-2012.aspx

  • August 15th: Developers will be able to download the final version of Windows 8 via their MSDN subscriptions and DreamSpark Premium.
  • August 15th: IT professionals testing Windows 8 in organizations will be able to access the final version of Windows 8 through their TechNet subscriptions.
  • August 16th: Education institutions with existing Microsoft Software Assurance for Windows will be able to download Windows 8 Enterprise edition through the Volume License Service Center (VLSC), allowing you to test, pilot and begin adopting Windows 8 Enterprise within your organization.
  • August 16th: Microsoft Partner Network members will have access to Windows 8.
  • August 20th: Microsoft Action Pack Providers (MAPS) receive access to Windows 8.
  • September 1st: Volume License customers without Software Assurance will be able to purchase Windows 8 through Microsoft Volume License Resellers and Academic License Resellers.

So over the next few days and weeks you will see the availability of exciting new models of PCs loaded with Windows 8, with general availability of Windows 8 on October 26, 2012.

More details http://windowsteamblog.com/windows/b/bloggingwindows/archive/2012/08/01/windows-8-has-reached-the-rtm-milestone.aspx

Also, Windows Server 2012 has been released to manufacturing.

On September 4, Windows Server 2012 will be generally available for evaluation and download by all customers around the world. On that day we will also host an online launch event where our executives, engineers, customers, and partners will share more about how Windows Server 2012 can help organizations of all sizes realize the benefits of what we call the Cloud OS. You will be able to learn more about the features and capabilities and connect with experts and peers. You’ll also be able to collect points along the way for a chance to win some amazing prizes. You don’t want to miss it. Visit this site to save the date for the launch event.

More details http://blogs.technet.com/b/windowsserver/archive/2012/08/01/windows-server-2012-released-to-manufacturing.aspx

What is the difference between task and thread in Managed Code?

 

This is a common question when discussing programming patterns and techniques.

A task is something you want done.

A thread is one of possibly many workers who perform that task.

In .NET 4.0 terms, a Task represents an asynchronous operation. Threads are used to complete that operation by breaking the work up into chunks and assigning them to separate threads.
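To make the distinction concrete, here is a minimal sketch (my example, independent of the samples below):

using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class TaskVsThread
{
    static void Main()
    {
        // a Task describes the work you want done; the thread pool supplies the worker
        Task<long> sum = Task.Factory.StartNew(() => Enumerable.Range(0, 1000000).Sum(i => (long)i));

        // a Thread is a dedicated worker that you create and manage yourself
        Thread worker = new Thread(() => Console.WriteLine("Dedicated worker running"));
        worker.Start();
        worker.Join();

        // Result blocks until the asynchronous operation has completed
        Console.WriteLine("Sum: " + sum.Result);
    }
}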

 

Using asynchronous patterns in UI programming:

void async_uitasks()
{
    // compute the square roots concurrently
    Task.Factory.StartNew(() =>
    {
        double dSumme = 0.0;
        for (int i = 0; i < 100000000; i++)
        {
            dSumme += Math.Sqrt(i);
        }
        return dSumme;
    }).ContinueWith(dTask =>
    {
        // runs when the computation has finished
        // (sw is a Stopwatch started elsewhere; textBoxResult is a WPF TextBox)
        sw.Stop();

        // are we on the "right" (UI) thread?
        if (textBoxResult.Dispatcher.CheckAccess())
        {
            // yes, write the results directly
            UpdateUI(dTask.Result, sw.ElapsedMilliseconds, false);
        }
        else
        {
            // no, marshal the call to the UI thread via Invoke
            UpdateUIDelegate delUpdate = new UpdateUIDelegate(UpdateUI);
            textBoxResult.Dispatcher.Invoke(DispatcherPriority.Normal, delUpdate,
                dTask.Result, sw.ElapsedMilliseconds, true);
        }
    });
}
 
Non-blocking tasks
 
const int x = 3000;
const int y = 1000;

// your scheduler
TaskScheduler scheduler = TaskScheduler.Default;

Task nonblockingTask = new Task(() =>
{
    CancellationTokenSource source = new CancellationTokenSource();
    Task t1 = new Task(() =>
    {
        while (true)
        {
            // do something
            if (source.IsCancellationRequested)
                break;
        }
    }, source.Token);
    t1.Start(scheduler);

    // wait for task 1; Wait returns true if the task finished within the timeout
    bool finishedInTime = t1.Wait(x);
    if (!finishedInTime)
    {
        // not finished at the first timeout: display a message and wait again
        Console.WriteLine("Message to user: the operation hasn't completed yet.");
        bool finishedSecondTime = t1.Wait(y);
        if (!finishedSecondTime)
        {
            source.Cancel();
            Console.WriteLine("Operation stopped!");
        }
    }
});
nonblockingTask.Start();
Console.WriteLine("Do whatever you want...");
Console.ReadLine();

Troubleshooting TFS 2010 reports & Warehouse

 

Background: Physical Architecture of TFS Reporting

Each TFS component maintains its own set of transaction databases. This includes work items, source control, tests, bugs, and Team Build. This data is aggregated into a relational database. The data is then placed in an Online Analytical Processing (OLAP) cube to support trend-based reporting and more advanced data analysis.

The TfsWarehouse relational database is a data warehouse designed for data querying rather than transactions. Data is transferred from the various TFS databases, which are optimized for transaction processing, into this warehouse for reporting purposes. The warehouse is not the primary reporting store, but you can use it to build reports. The TfsReportDS data source points to the relational database. The Team System Data Warehouse OLAP cube is an OLAP database that is accessed through SQL Server Analysis Services. The cube is useful for reports that analyze trends, such as “how many bugs were closed this month versus last month?” The TfsOlapReportDS data source points to the Team System Data Warehouse OLAP cube in the Analysis Services database.

10 Steps to Troubleshoot TFS Reporting

1. On the TFS Application tier server, open an Administrative Command Prompt

2. Run the following command: Net Stop TFSJobAgent

3. Once this completes, run the following command to restart the TFSJobAgent: Net Start TFSJobAgent

4. Open the TFS Administration console, and select the Reporting Node

5. Click the Start Rebuild link to rebuild the warehouse. Refresh this page until it displays “Configured and Jobs Enabled”

6. Open a web browser and navigate to the warehousecontrolservice.asmx page at:

http://<server>:8080/tfs/teamfoundation/administration/v3.0/warehousecontrolservice.asmx

7. Click ProcessWarehouse, then click Invoke on the subsequent page. This should return True.

8. Return to the WarehouseControlService.asmx page, then click ProcessAnalysisDatabase.

9. Enter Full for the processingType, then click Invoke; this should also return True.

10. Return to the WarehouseControlService.asmx page and click GetProcessingStatus; this should return the processing status page with the current processing results. It should indicate that Full Analysis processing is occurring. Refresh this page until the status (ResultMessage) of the “Full Analysis Database Sync” indicates “Succeeded”.

 

Refresh TFS Warehouse, Cube and Reports on demand

By default, TFS processes its Data Warehouse and Analysis Services cube (and thus updates the data for the reports) every 2 hours. Be careful about lowering this to less than an hour:

Important

If you reduce the interval to less than the default of two hours (7200 seconds), processing of the data warehouse will consume server resources more frequently. Depending on the volume of data that your deployment has to process, you may want to reduce the interval to one hour (3600 seconds) or increase it to more than two hours. [Source: MSDN]
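If you do decide to change the interval, the WarehouseControlService.asmx page used in the troubleshooting steps above also exposes a ChangeSetting operation for this; to the best of my knowledge, the relevant setting is RunIntervalSeconds, for example:

ChangeSetting
    settingId: RunIntervalSeconds
    newValue:  3600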

Alternatively you can use this small command line utility from Neno Loje:

Syntax/Usage:

tfsrefreshwarehouse.exe /server:http://servername:8080/tfs [/full] [/status]

Manually process the TFS Warehouse and Cube

Using just the /status parameter returns useful information about cube processing:

Using /status shows the last and next scheduled sync times

(Note: The user needs to have the ‘Administer Warehouse‘ permission in TFS)

Download the tool from here: TfsRefreshWarehouse.exe (ZIP, 12.8 KB)

Using TFS for Beginners – Common Issues

 

Below are a few issues which I guess one would run into on first using TFS & Team Explorer; some of them are fixed in TFS 2010 and some others in TFS 11. :)

1) Permanently deleting dummy projects: After playing around for a while, there will be a few dummy team projects lying around. By default TFS uses a soft delete; for a permanent (hard) delete, one can use the tf command line utility with the destroy option.

Note: if you have already deleted a project, you need to undelete it and check in the pending changes, because destroy doesn’t work on deleted items. Also, the folder you are trying to delete should be mapped to a workspace (File -> Source Control -> Workspaces…).

E.g. tf destroy $/YourProjectName/SubFolder/… /collection:http://myserver:8080/tfs/defaultcollection (for other options check out tf destroy /?)

2) Logging in as a different user: By default, VS.NET asks you for credentials to connect to TFS every time you run it. You can avoid this by caching the required credentials. To do so, go to Control Panel -> User Accounts -> Manage your network passwords (left column) -> click Add and enter the required details. Once added, VS.NET won’t trouble you for credentials.
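The same credentials can also be cached from a command prompt with the built-in cmdkey utility (the server, domain, and user names below are placeholders):

E.g. cmdkey /add:mytfsserver /user:MyDomain\MyUser /pass:MyPassword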

3) Deleting a workspace: A workspace belongs to an owner (a user authenticated by TFS). Let’s say you have logged in as Admin and set your working folder to C:\WorkingFolder. Now you want to log on to TFS as a local user (without admin rights) and use the same mapped path (C:\WorkingFolder). At this point TFS will complain that Admin is already using that location, so you can’t use it. In order to remove the workspace created by Admin, you again need to fall back on the tf utility.

E.g. tf workspace /delete MyWorkspace;MyDomain\Admin /server:http://MyTFS:8080/tfs

4) Automatic check out not working: If you go to VS.NET Tools -> Options -> Source Control -> Environment, you will see two drop-downs. One of them reads Editing – Check Out Automatically. This means that when you have a project open and you edit files via Solution Explorer, those files are checked out automatically. But sometimes you won’t find it working. A possible reason: your solution is not bound to source control. To regain the bindings, click File -> Source Control -> Change Source Control (N.B. the Source Control Explorer should be closed at this point). There you will see a list of your projects and solutions. Select them and click the Bind button on the toolbar. Things should start working as expected. When you bind the TFS project and solution, they will be checked out by default, and two additional files, .vspscc and .vssscc, will be created. You need to check in the project (.csproj) / solution (.sln) files to avoid rebinding the solution next time. There isn’t a need to include the .vspscc and .vssscc files in TFS.

5) Unlocking a file: two steps – find the workspace that belongs to the user, and then execute the undo command, specifying the workspace, user account, and file path:
tf workspaces /owner:domain\userid (you get these parameters from the file lock message)
tf undo /workspace:workspacename;domain\userid $/filePath (the filePath can be copied from File Properties in Source Control)

6) Permanently deleting a work item: It is quite possible that your team creates dummy work items while getting familiar with the system. At times it may be important to clean up these items so that they don’t impact your charts and reports. Below is the command you can use to delete a work item permanently.

witadmin destroywi /collection:CollectionURL /id:id [/noprompt]
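For example, to permanently destroy the work item with ID 123 without a confirmation prompt (the collection URL is a placeholder):

E.g. witadmin destroywi /collection:http://myserver:8080/tfs/DefaultCollection /id:123 /noprompt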