Saturday, June 14, 2014

New Features in Visual Studio 2013 and .NET Framework 4.5.1

Visual Studio 2013 is the next-generation IDE for developers of Microsoft platform applications—Windows 8.x and .NET Framework. Visual Studio 2013 provides enhanced support for prototyping, designing, and modeling, along with improved testing tools, letting developers build Windows, web, and cloud applications. In this article, I'll provide a high-level view of the top new features and enhancements in the Visual Studio 2013 IDE. I will also briefly discuss new features introduced in .NET Framework 4.5.1, which was released concurrently with Visual Studio 2013 in mid-October 2013.
Related: Microsoft Visual Studio 2013 Now Available for Download

Getting Started with Visual Studio 2013

To run Visual Studio 2013 on your system, you should have Windows 7 or later installed, preferably on a system with at least 4GB of RAM. You can learn more about the system requirements to install and run each edition of Visual Studio 2013 on the Visual Studio 2013 Compatibility page. To download Visual Studio 2013, go to the Visual Studio Downloads page. After you have downloaded Visual Studio 2013, you can start the installation after mounting the ISO file or unzipping the ISO file that you downloaded. Figure 1 shows how the Visual Studio 2013 IDE appears after it has been installed and configured (I've selected General Development Settings in my system).
Figure 1: Visual Studio 2013 Start Page

Code Editor Improvements

A key improvement in the Visual Studio 2013 code editor is the new Peek Definition window. Peek Definition lets you stay in the context of the code while you browse through your code in the Code Editor window. You invoke Peek Definition either by right-clicking a symbol and clicking Peek Definition from the context menu or by pressing Alt+F12, as shown in Figure 2.
Figure 2: Accessing Peek Definition from the Code Editor
You'll then see the Peek Definition window, as shown in Figure 3.
Figure 3: Working in the Peek Definition Window in Visual Studio 2013
Visual Studio 2013 provides greatly enhanced navigation in the Code Editor window. Key new navigation features include Auto Brace Complete, which automatically adds closing parentheses, quotes, braces, and brackets while you type; Move Line Up/Down, which lets you press Alt+Up Arrow or Alt+Down Arrow to move one or more lines up or down; and an enhanced Navigate To feature with an optimized search capability that lets you type part of a symbol and find its definition, or type part of a filename and immediately switch to that file.
Support for JavaScript coding is also improved in Visual Studio 2013 in the form of a new identifier highlighting feature. This feature highlights all references to a variable or function call when you select the variable or the function call point in a source file.

Support for Building Windows 8.1 Store and Cloud Business Apps

Visual Studio 2013 includes project templates that let you create Windows Store applications for Windows 8.1, as well as maintain existing Windows 8 Store apps. Visual Studio 2013 also includes a new Cloud Business App template, powered by Visual Studio LightSwitch, which developers can use to build a SharePoint 2013 application that is integrated with Office 365. The cloud app can be hosted either in Windows Azure or on a Microsoft IIS server.

Better Performance, Debugging, and Optimization

Microsoft has introduced a number of performance, optimization, and debugging improvements in Visual Studio 2013. One noteworthy new feature, Memory Dump Analyzer, analyzes .NET memory heap dumps—a big help in diagnosing memory issues. Memory Dump Analyzer helps you analyze memory usage over a period of time and detect memory leaks using dump (.dmp) files collected on production systems, such as the example shown in Figure 4. Note that this feature is available in Visual Studio 2013 Ultimate only; additionally, the process dump must be collected from a system that has .NET Framework 4.5 or later installed and running.
Figure 4: Viewing a .dmp File in Memory Dump Analyzer
Visual Studio 2013 also provides much-improved support for diagnostics and a new testing feature. To perform memory profiling, you can now invoke the Performance Wizard, select the profiling method of your choice, and then start profiling. For testing, the new cloud-based load testing feature in Visual Studio Online lets you monitor the performance, scalability, and reliability of your applications deployed in the cloud.

TypeScript Support

In Visual Studio 2013, Microsoft has expanded support for TypeScript, an open source superset of JavaScript from Microsoft that's available on CodePlex. The TypeScript plug-in, which provides an editor and compiler for Visual Studio, has been updated to integrate with Visual Studio 2013 as well as Visual Studio 2012. You add a TypeScript file to a project in Visual Studio 2013 either through the Add, New Item dialog or by right-clicking in Solution Explorer and choosing Add, TypeScript File. When you add a TypeScript file, Visual Studio 2013 automatically configures the project to support TypeScript. Note that you can also rename your .js JavaScript files to .ts TypeScript files.
Microsoft continues to improve TypeScript's Visual Studio support. For example, the latest version of TypeScript, 0.9.5, provides a new project property page where you can configure a TypeScript project and lets you add a _references.ts file to a project.
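To give a flavor of what the compiler adds on top of plain JavaScript, here is a minimal sketch of a .ts file (the interface, function, and variable names are my own, not from the product documentation):

```typescript
// A small TypeScript file: the interface and type annotations are
// checked at compile time, then erased in the emitted JavaScript.
interface Person {
    firstName: string;
    lastName: string;
}

function greet(person: Person): string {
    return `Hello, ${person.firstName} ${person.lastName}`;
}

const user: Person = { firstName: "Ada", lastName: "Lovelace" };
console.log(greet(user));
```

Passing an object that lacks `lastName` to greet would be flagged as an error in the editor before you ever run the code.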
RELATED: New Visual Studio 2013 Security Feature: ASP.NET Identity

Improved ALM Support

Visual Studio Team Foundation Server (TFS) 2013, Microsoft's application lifecycle management (ALM) collaboration solution for Visual Studio, includes a number of new features to improve version control, agile development and collaboration, project planning, testing, deployment, and other phases of the software development lifecycle. For example, the useful new work-item charting feature lets developers create diverse charts from work-item queries (e.g., bugs).
Microsoft's new release management product for TFS, Release Management for Visual Studio 2013 (formerly InRelease from InCycle Software), is now available for TFS 2013. Release Management for Visual Studio 2013 works with Visual Studio Ultimate 2013, Visual Studio Premium 2013, or Visual Studio Test Professional 2013 and comprises several components: Release Management Client for Visual Studio 2013, Release Management Server for TFS 2013, and Microsoft Deployment Agent 2013. Release Management for Visual Studio 2013 helps you to manage your application releases efficiently by creating reliable, repeatable automated deployment cycles.

New Features in .NET Framework 4.5.1

The most notable improvements in .NET Framework 4.5.1 are to its debugging and diagnostics capabilities. Here's a quick look at some of the new features:
  • Support for x64 Edit and Continue (EnC): The EnC feature, which lets you edit code while debugging in Visual Studio without stopping and restarting the debugging session, is now available for 64-bit environments.
  • Async/await debugging: This feature improves support for asynchronous debugging in Windows Runtime (WinRT). The Visual Studio 2013 debugger includes a new Tasks window, which replaces the Parallel Tasks window from earlier versions and lets developers view tasks in an application that represent asynchronous operations.
  • Performance improvements:
    • Multi-Core JIT: uses parallelization to reduce application startup time by decreasing the amount of time needed for just-in-time (JIT) compilation
    • On-demand compaction of large object heaps: provides support for compaction of large object heaps as part of a forced garbage collection operation
    • ASP.NET App Suspend: suspends websites that stay idle for a long time, thus reducing the consumption of resources and boosting the application's overall throughput
    • ADO.NET idle connection resiliency: rebuilds broken and/or idle connections to SQL Server databases automatically and transparently and works both in synchronous and asynchronous modes of operation.
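As a sketch of the large object heap feature above: .NET Framework 4.5.1 exposes it through the GCSettings.LargeObjectHeapCompactionMode property (the surrounding class and method names here are my own):

```csharp
using System;
using System.Runtime;

class LohCompactionSketch
{
    static void Main()
    {
        // Request a one-time compaction of the large object heap (LOH)
        // as part of the next blocking full garbage collection.
        GCSettings.LargeObjectHeapCompactionMode =
            GCLargeObjectHeapCompactionMode.CompactOnce;

        // Force the collection; the setting automatically reverts to
        // Default once the compacting collection has run.
        GC.Collect();

        Console.WriteLine(GCSettings.LargeObjectHeapCompactionMode);
    }
}
```

Because compacting the LOH is expensive, it is on demand only; nothing changes unless you opt in as shown.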

New Tools, Better Productivity

This article only touched the surface of the extensive set of new capabilities that Visual Studio 2013 and .NET 4.5.1 offer Microsoft platform developers. I encourage you to upgrade to the latest versions of these tools and try out their many productivity-enhancing features.

Visual Studio Tips & Tricks

10 ways to get more productive in Visual Studio

Visual Studio is full of shortcuts and handy tools helping you, as a programmer, to be more efficient in your work. Today I’m sharing a few of my favourite ones. All examples given here are tested in Visual Studio 2012.
  1. Open search result in a list
    Press CTRL + F to open the search pane. Enter your search string, then open the alternative search options next to the Find button and select Find all.
    Besides taking you to the first occurrence of the text searched for, Visual Studio also opens a “Find Results” window (normally at the bottom) with all the occurrences of the search phrase.
    Find results
    In this window you can click through all locations and at the bottom you’ll also get some statistics of what has been found.
  2. Find active document in Solution Explorer
    Using menu options like “Go to definition” is great. But sometimes you’re taken to a document and you have no idea where the document belongs, especially if you have plenty of projects. By clicking on the Sync with Active Document button (see picture below), Solution Explorer will navigate to the location where the current document exists and mark it.
    Synchronize file views in Visual Studio
  3. Debugger.Break
    It might sound a bit strange to have a hard-coded break in your code, but it can be useful for testing purposes. Breakpoints are very handy but they can be easily turned off, while this line of code has to be removed to prevent the debugger from stopping. An example is shown here:
    catch (Exception)
    {
      // Please report to ... if your debugger breaks here
      Debugger.Break();
    }
    The Debugger class can be found in the System.Diagnostics namespace.

  4. Cut the line
    If you select text and press CTRL + X, that text will be cut. But if you don’t select any text and press CTRL + X, the whole line (including the carriage return) will be cut. You can also press CTRL + L to cut the whole line without the carriage return.
  5. Move line up or down
    Hold down ALT while pressing up or down and you’ll move the current line up or down.
  6. Zoom
    Hold down CTRL while you’re using the wheel on your mouse and you can easily zoom in and out as needed.
  7. Auto format the document
    Are you facing a document with a lot of wrong indentations? Press CTRL + K, CTRL + D to format the whole document. If you only want to format a smaller selection then select the code and press CTRL + K, CTRL + F. If you open the menu Tools | Options | Text Editor you can adjust how Visual Studio is going to do this formatting.
  8. Select vertically and update all lines at once
    Hold down ALT while you’re selecting a code snippet. You’ll then be able to do a vertical selection instead of line by line selection. Funky, right?
    But it doesn’t stop there. You can also do a simultaneous update of all the lines you’ve marked. In the example below we’re able to change all variables at once from private to public.
  9. Scroll with arrow keys without moving marker
    Hold down CTRL while pressing up or down and you’ll scroll the window while keeping the marker at the same position. The effect is equal to using the scroll wheel on the mouse or the scroll bar at the side of the window, but you need the mouse to do both of them while this is a key-only option.
  10. View all open windows in a list
    Press CTRL + SHIFT + TAB to open this overview window. You can then move around using the TAB key or the arrows. Very convenient!

8 great features in Visual Studio 2013

In an earlier article I wrote about 10 ways to get more productive in Visual Studio. This is a follow up with new neat things coming with Visual Studio 2013.
  1. Open the Resolve menu when typing It happens once in a while that you haven’t included the namespace you need at the top of your file. One common example is Trace, found in System.Diagnostics. If you write Trace, Visual Studio suggests TraceMode and other things, but no Trace. To quickly solve this, type Trace followed by CTRL + SPACE + . and the Resolve menu comes up, where you can select to include using System.Diagnostics; to make Trace available. Very handy…
  2. Create a new Azure website from within VS There is no need anymore to open the Azure portal and create a website there before you publish it using Visual Studio. Now you can do it directly within. (You need to make sure you’ve installed Azure SDK 2.2 and can connect to Azure properly. You can check your connection to Azure by opening the Server Explorer window, click on Windows Azure and you should then be able to see your existing websites and other Azure things.)
    1. In the Publish Web dialog click on the Import button.
    2. In the next Import Publish Settings dialog click on New.
    3. In the final dialog Create a site on Windows Azure, enter the settings you want for your new website and click Create.
    4. Your publishing profile is now ready to be used and you can click on Publish to upload your website, without going through the Azure portal.
  3. Live debugging in Azure After you’ve published your website you can also, using live debugging, step through the executed code line by line as if it was on your local machine. Locate your published website under Server Explorer -> Windows Azure -> Web Sites. Right-click on your website and select Attach Debugger.
    Your Visual Studio is now connected to Azure and each breakpoint you put in the code will work against the live website.
  4. Live tracing from Azure In the same way you can do live debugging you can also make all trace messages, written in an Azure application, end up in your Visual Studio’s Output window. Add trace messages to your code simply by writing like this.
    Trace.WriteLine("This is my trace message");
    Don’t forget to re-publish your application if you make any changes to your code.
    Locate your website under Server Explorer as you did in the previous step. Right click on it and select View settings. Change Application logging (File system) to Verbose and click Save as done in the picture.
    Right click on the website again and this time select View streaming logs in Output window. In the Output window you can then see all the trace messages as they are being processed.
    Trace Message
    Don’t forget to turn off Verbose debugging after having done your testing. You might end up with huge amounts of log files otherwise.
  5. 64-bit Edit and Continue Have we been waiting for this one or what?!? Finally, by using .NET Framework 4.5.1, we can have the same Edit and Continue for 64-bit code as we’ve been able to with 32-bit code for ages. So, good bye to this dialog!
  6. Return value inspection To help out the debugging process you can now easily see return values of functions being used as parameters in other function calls, for example if you nest several functions in each other.
    public partial class MainWindow : Window
    {
        Random rnd = new Random();

        public MainWindow()
        {
            Foo(Bar() * Bar());
        }

        int Bar()
        {
            return rnd.Next(1, 100);
        }

        void Foo(int p)
        {
        }
    }
    After you’ve stepped over line 7 (the call to Foo) this is what you’ll see in the Autos window. We called Bar twice inside the calling to Foo and the results can be seen here.
  7. Just my code This feature tells the debugger to only step through the code you’ve written yourself and ignore frameworks and other code. The system does this by looking at open projects, .pdb files and program optimisations. For .NET Framework, this came before VS 2013, but what’s new now is that it’s available for C++ and JavaScript as well. To enable or disable Just My Code, open Debug -> Options and Settings -> General and change the value of Enable Just My Code.
  8. Peek a definition It’s now possible to open up a method definition without having to open that specific file. You can even open up a section located further up/down in your current file without having to leave the location you are at now. This feature is called Peek Definition and can be accessed through ALT + F12 or by right-clicking the method and selecting Peek Definition. Here is an example where I’m peeking at the InitializeComponent() method.
    Note: If you, like I, have Telerik’s JustCode installed then ALT + F12 is connected to Find all references. To change, go to Tools -> Customize -> Keyboard and write PeekDefinition in the Show Commands Containing: textbox. Mark the Edit.PeekDefinition command and click in the Press shortcut keys: box. Press ALT + F12 and then click on Assign. Done! Telerik’s shortcut had to move for this new excellent feature!

Visual Studio 2013: A first look at Microsoft's sleek new IDE

Discover which features and enhancements were added in Visual Studio 2013 to increase developer productivity.

Visual Studio 2013 was a hot topic at Microsoft's TechEd conference earlier this year, but we had to wait until this month to get the final product. A new version of Microsoft's flagship IDE is a little irritating, as I just got fully comfortable with Visual Studio 2012.
I finally got a copy of Visual Studio 2013 and took it for a ride this week. Here's a quick tour of what Visual Studio 2013 has to offer; I focus on basic IDE enhancements, web development, productivity gains, and Windows-specific development.

New and improved interface

One of the first things you'll notice about Visual Studio 2013 is the revamped start page along with the ability to keep your development environment settings synchronized across the various devices you may use.
Figure A shows the start page for the initial loading of my copy of Visual Studio Ultimate 2013. Settings synchronization is accomplished by using your MSDN or Microsoft account -- you basically sign in, and your settings will be applied every time you log on. The chore of keeping your home development machine in sync with your office laptop is greatly simplified and automated. Also, the logon handles product registration and so forth.
Figure A
The Visual Studio 2013 start page presented when the IDE is opened the first time.
One goal in the development of Visual Studio 2013 was developer productivity, with an emphasis on putting more information at developers' fingertips without leaving the tool. A feature tied to this goal is CodeLens (Code Information Indicators). Basically, CodeLens allows you to view information on code changes -- who made the changes, recent tests, code references, etc. -- within the code; this assumes you're using Team Foundation Server (TFS). It gives you a quick summary without any additional work. While I'm not a TFS user, I'm always interested in identifying how often and where code is used, so the references feature is nice to see on the fly.
Another great time-saving feature is Peek Definition (Figure B), which lets you view code definition inline without leaving your current place in the code. Simplifying code navigation was a goal with this release as was the ability to change scroll bar behavior within the code window -- there is bar and map mode. I prefer map mode, because it provides a live preview of code on which you hover.
Figure B
Peek Definition allows you to view source inline.
A quick stroll around Visual Studio 2013 reveals cosmetic changes like more intuitive icons and greater use of colors. Figure C demonstrates some of the icons and colors; I like the darker background, which is easier on my eyes. There are other themes available for background, toolbars, and so forth, or you can customize your own.
Figure C
A sleeker and more user-friendly interface in Visual Studio 2013.

Web development

Most of my projects these days involve a web interface, so I'm very interested in any tweaks with web development via Visual Studio 2013. One of the biggest changes is the Browser Link feature. The premise is you have multiple browsers open for testing web applications, and Browser Link keeps these browsers updated without you having to switch to the browser instance and manually refresh. This also meets the goal of improving developer productivity.
Another positive change is the ability to combine multiple ASP.NET web page features: Web Forms, MVC, WCF, and so forth -- it's a step closer to the One ASP.NET goal. This is a big change from recent Visual Studio versions, in which you had to pick one or another and stick with it. Now there is just one ASP.NET project type (Figure D). After you choose the ASP.NET project type, a dialog is displayed (Figure E) in which you can choose one template or use the checkboxes to add references for the others.
Figure D
ASP.NET is the only web project type in Visual Studio 2013.
Figure E
Working with multiple ASP.NET page layouts in Visual Studio 2013.
There is full support of the latest and greatest web standards, with HTML5 and CSS3 completely integrated. There is full IntelliSense support for both even when typing CSS3 directly in HTML5 source. JavaScript is fully supported along with TypeScript.
You can download Web Essentials 2013 to add many common items for web developers, like standard libraries and such. It's a head-scratcher why this wasn't included with the new product release as opposed to requiring another download; I guess Microsoft assumes not everyone is using the IDE for web development. It's worth noting that Bootstrap is now the standard front-end framework in the ASP.NET project templates.

Windows development and more

The Visual Studio 2013 release coincided with unleashing Windows 8.1 on the world, and the former can be used to build solutions for the latter. While I'm not a Windows Store application developer, it's nice to know the features are available when/if I enter that world; this includes IntelliSense for XAML. There are a number of templates available online for building Windows Store applications.
There are new versions of the .NET Framework, along with ASP.NET included with Visual Studio 2013. Both are currently at version 4.5.1, so you'll need to spend some time becoming familiar with what has changed in these new versions.
There are multiple editions of the product, and they offer various features. Visual Studio Ultimate 2013 has the most features, while Visual Studio Professional 2013 is on the bottom rung of the ladder. All of the editions share the same core feature set, with other features added depending on your role -- like additional TFS support if you work with a team that uses it. You can get more information on the different editions of Visual Studio 2013 on the Microsoft site.

Bottom line

At first glance, the sleek Visual Studio 2013 is nice on the eyes, it's easy to navigate, and you can get up and running in no time. I applaud Microsoft on the goal to help developers work smarter with less need to leave the IDE.
The product includes many more features than what I have room to cover here, including tools for cloud development via Azure, Application Lifecycle Management (ALM) features, Git support, and diagnostic tools. I've only had a short time to work with the new release, so while my initial impression is favorable, I'll have to employ it for a full project to determine whether the improvements are real or just window dressing.

New SQL Server 2014 Dynamic Management Views

SQL Server 2014 offers many new features and along with that comes many new tools to help manage these new features.  In this tip I will introduce the new Dynamic Management Views in SQL Server 2014.
Every SQL Server release comes with new and improved features and because of this, new Dynamic Management Views are included allowing us to collect metrics about these new features.
This version of SQL Server includes amongst its features:
  • The Hekaton Engine, which brings us an optimistic, latch-free and lock-free environment for In-Memory OLTP workloads.
  • Buffer Pool Extension, which allows us to integrate fast nonvolatile storage, such as an SSD, to extend the SQL Server instance's Buffer Pool and therefore reduce I/O latency and increase transaction throughput.
Obviously these new features introduced a new set of DMVs which I will outline in this tip.

SQL Server Dynamic Management Views for Memory-Optimized Tables

The introduction of In-Memory OLTP in SQL Server 2014 brings us a new category of DMVs that gives us the possibility to capture real-time metrics of the Hekaton engine. All Dynamic Management Views and functions that contain "xtp" (an acronym for eXtreme Transaction Processing) in their names refer to the Hekaton In-Memory OLTP Engine.
There are two types of In-Memory OLTP Dynamic Management Views:
  • Database specific: These DMVs give information and statistics about the current database. Their names start with "sys.dm_db_xtp_".
  • Instance specific: These DMVs return information and statistics about the entire In-Memory OLTP Engine of the instance. Their names start with "sys.dm_xtp_".
Here is the list of DMVs:

Undocumented SQL Server Dynamic Management Views and Functions

  • sys.dm_db_xtp_nonclustered_index_stats: Displays statistics of Range Indexes in Memory-Optimized Tables.
  • sys.dm_db_xtp_object_stats: Reports row insert, update and delete attempts in Memory-Optimized tables
  • sys.dm_xtp_threads: Shows information about Hekaton threads like Base Address and thread type.
  • sys.dm_xtp_transaction_recent_rows: As its name says, returns information of recent rows within transactions.
  • sys.fn_dblog_xtp: Like sys.fn_dblog, displays transaction log information, but adds the following Hekaton specific columns:
Column Name               Data Type
operation_desc            nvarchar(30)
tx_end_timestamp          bigint
total_size                int
table_id                  bigint
newrow_identity           bigint
newrow_data               varbinary(8000)
newrow_datasize           int
oldrow_begin_timestamp    bigint
oldrow_identity           bigint
oldrow_key_data           varbinary(8000)
oldrow_key_datasize       int
xtp_description           nvarchar(1024)
  • sys.fn_dump_dblog_xtp: Same as above, but also works with backup devices, just like sys.fn_dump_dblog.
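Because these functions are undocumented, the sketch below is illustrative only and may behave differently between builds; passing NULL, NULL reads from the start to the end of the active log:

```sql
-- Illustrative only: sys.fn_dblog_xtp is undocumented, so use it on
-- test systems rather than production.
SELECT [Current LSN], operation_desc, table_id, tx_end_timestamp
FROM sys.fn_dblog_xtp(NULL, NULL)
WHERE operation_desc IS NOT NULL;
```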

Operating System related SQL Server Dynamic Management Views

  • sys.dm_os_buffer_pool_extension_configuration: Returns configuration information about the buffer pool extension, a new feature of SQL Server 2014 that allows us to extend the buffer pool cache with nonvolatile storage like a SSD disk.
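For example, you might enable the extension and then inspect it through the DMV (the file path and size below are hypothetical):

```sql
-- Enable the buffer pool extension on fast SSD storage
-- (hypothetical path and size)
ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION ON
    (FILENAME = 'E:\SSDCACHE\ExtensionFile.BPE', SIZE = 32 GB);

-- Inspect the current buffer pool extension configuration
SELECT path, state_description, current_size_in_kb
FROM sys.dm_os_buffer_pool_extension_configuration;
```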

Execution Related SQL Server Dynamic Management Views

  • sys.dm_exec_query_profiles: The purpose of this DMV is to monitor in real time query profiles. In layman terms, when you execute a query with any profiling option you can watch its progress using this DMV.
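A quick sketch of how this works in practice (the table name is hypothetical): opt the query in to profiling in one session, then poll the DMV from another.

```sql
-- Session 1: opt the query in to profiling, then run it
SET STATISTICS PROFILE ON;
SELECT COUNT(*) FROM dbo.BigTable;  -- hypothetical long-running query

-- Session 2: watch per-operator progress in real time
SELECT session_id, node_id, physical_operator_name,
       row_count, estimate_row_count
FROM sys.dm_exec_query_profiles
ORDER BY session_id, node_id;
```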

I/O Related SQL Server Dynamic Management Views and Functions

  • sys.dm_io_cluster_valid_path_names: Returns the valid shared disk and Cluster Shared Volume paths available to hold database files on a SQL Server Failover Cluster Instance. These are the columns it returns:
Column Name                 Data Type
path_name                   nvarchar(256)
cluster_owner_node          nvarchar(60)
is_cluster_shared_volume    bit

SQL Server Resource Governor Dynamic Management Views

SQL Server AlwaysOn Availability Groups Dynamic Management Views and Functions

  • sys.fn_hadr_is_primary_replica: This function returns 1 if the database on the current instance is the primary replica.
  • sys.dm_hadr_cluster: Returns information about the quorum of a Windows Server Failover Cluster on an AlwaysOn Availability Group or an AlwaysOn Failover Cluster Instance.
  • sys.dm_hadr_cluster_members: Shows information about Cluster Members.
  • sys.dm_hadr_cluster_networks: Returns network information about Windows Server Failover Cluster members participating in AlwaysOn Failover Cluster Instances or AlwaysOn Availability Groups.
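As a sketch of how sys.fn_hadr_is_primary_replica can be used (the database name is hypothetical), a maintenance job can check its own role before doing any work:

```sql
-- Run maintenance only when this instance hosts the primary replica
IF sys.fn_hadr_is_primary_replica(N'SalesDB') = 1
BEGIN
    PRINT 'SalesDB is primary on this instance; running maintenance.';
    -- ... maintenance commands here ...
END
```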

Not a SQL Server DMV

  • sys.column_store_row_groups: Shows information about clustered columnstore indexes, such as row group state and the total number of physically stored rows (including those marked as deleted), with the number of deleted rows reported in a separate column. This catalog view is useful to determine which row groups have a high percentage of deleted rows and should be rebuilt.
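For instance, a query along these lines (a sketch; thresholds are up to you) surfaces the row groups most in need of a rebuild:

```sql
-- Find compressed row groups with a high percentage of deleted rows,
-- which are candidates for an index rebuild
SELECT OBJECT_NAME(object_id) AS table_name,
       row_group_id,
       state_description,
       total_rows,
       deleted_rows,
       100.0 * deleted_rows / total_rows AS deleted_pct
FROM sys.column_store_row_groups
WHERE state_description = N'COMPRESSED'
  AND deleted_rows > 0
ORDER BY deleted_pct DESC;
```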

How to list all new SQL Server objects between versions

If you have installed both SQL Server 2012 and SQL Server 2014, you can follow the steps below to list all new objects.

1 - Create a Linked server to the SQL Server 2014 instance.

USE [master]
EXEC master.dbo.sp_addlinkedserver @server       = N'SQL2014',
                                   @srvproduct   = N'SQL Server' 
/* For security reasons the linked server remote logins password is changed with ######## */ 
EXEC master.dbo.sp_addlinkedsrvlogin @rmtsrvname    = N'SQL2014',
                                     @useself       = N'False',
                                     @locallogin    = NULL,
                                     @rmtuser       = N'sa',
                                     @rmtpassword   = '########'

2 - Execute the script below

USE master
SELECT  A.name ,
        A.schema_id ,
        A.type ,
        A.type_desc ,
        C.name AS 'COLUMN NAME' ,
        C.column_id ,
        CT.name AS 'DATA TYPE' ,
        C.max_length ,
        C.precision ,
        C.scale
FROM    SQL2014.master.sys.all_objects A
        LEFT JOIN master.sys.all_objects B
    ON A.name = B.name
        LEFT JOIN SQL2014.master.sys.all_columns C
    ON A.object_id = C.object_id
        LEFT JOIN SQL2014.master.sys.types CT
    ON C.system_type_id = CT.system_type_id
            AND C.user_type_id = CT.user_type_id
WHERE   B.object_id IS NULL

What's new in SQL Server 2014? Is it worth the upgrade?

There is a great deal of information about the upcoming SQL Server 2014 version, especially about Hekaton which is the In-Memory OLTP engine feature of SQL Server 2014. But is Hekaton the only new feature in SQL Server 2014?  In this tip I will guide you through the new features that make SQL Server 2014 so outstanding.
Sometimes software vendors launch new versions of their products with minimal improvements with the sole purpose of maintaining visibility amongst competitors. Since migration of databases is a time consuming and expensive task, we as database professionals must decide what is best for our customers. That forces us to do some research about the new version's features in order to make the most accurate decision.
Let's take a look at the new features and improvements with SQL Server 2014.

SQL Server In-Memory OLTP Overview

SQL Server 2014 includes an In-Memory OLTP engine code named Hekaton. This engine provides a lock and latch free environment for OLTP workloads. It is fully integrated into SQL Server and accessed using standard T-SQL. Contrary to other products in the market, Hekaton is not a separate system, it is part of the SQL Server Database Engine.  Hekaton enables you to use both disk based tables and Memory-Optimized Tables together in the same queries and stored procedures.
Memory-Optimized Tables can be defined as durable, so data persists on server restart, or with schema-only durability, which preserves the table definition alone and is useful for ETL transformations and data staging.
Furthermore, Hekaton introduces Natively Compiled Stored procedures which are Transact-SQL Stored Procedures compiled to native code, to interact with Memory-Optimized Tables even more efficiently.
You can read more about Memory-Optimized tables and Natively Compiled Stored Procedures in my previous tips.
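To make this concrete, here is a minimal sketch (all object names and the bucket count are hypothetical, and the database needs a MEMORY_OPTIMIZED_DATA filegroup first) of a schema-only table and a natively compiled procedure that reads it:

```sql
-- A schema-only Memory-Optimized table: contents are lost on restart,
-- which suits staging data
CREATE TABLE dbo.StagingOrders
(
    OrderId INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    CustomerId INT NOT NULL,
    Amount MONEY NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);
GO

-- A natively compiled stored procedure accessing the table
CREATE PROCEDURE dbo.usp_GetOrder @OrderId INT
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT,
                   LANGUAGE = N'us_english')
    SELECT OrderId, CustomerId, Amount
    FROM dbo.StagingOrders
    WHERE OrderId = @OrderId;
END
```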

SQL Server 2014 Cloud Computing Enhancements

Since this version of SQL Server was designed with the premise of being a platform for the Hybrid Cloud, it has some new and exciting features.
An on-premises SQL Server instance can host databases whose data and log files are stored on Windows Azure Storage. This means that you can move your storage into the cloud while keeping all transaction processing on your local server. Furthermore, you can enable Transparent Data Encryption on such databases while keeping the encryption key on the local server for added security.
You can deploy a SQL Server database to a Windows Azure Virtual Machine in a few clicks with SQL Server Management Studio's Deploy a SQL Server Database to a Windows Azure Virtual Machine wizard.
This release also adds the ability to back up to and restore from a URL directly within SQL Server Management Studio.
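A backup to a URL can also be taken directly from T-SQL. The sketch below assumes a hypothetical storage account, container, and credential; in SQL Server 2014 the credential must hold the storage account key:

```sql
-- Back up directly to Windows Azure blob storage (names are hypothetical).
BACKUP DATABASE [SampleDB]
TO URL = 'https://myaccount.blob.core.windows.net/backups/SampleDB.bak'
WITH CREDENTIAL = 'MyAzureCredential', COMPRESSION, STATS = 10;
```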

SQL Server AlwaysOn Improvements

As I told you before, this version of SQL Server was conceived as a platform for the Hybrid Cloud, so the engineers at Microsoft enabled hybrid high availability solutions such as the creation of Windows Azure replicas for Availability Groups, with a simple Add Azure Replica Wizard to guide you through the process.
Also, readable secondary replicas now remain available for read workloads when the cluster loses quorum or a primary replica goes down.
Furthermore, the maximum number of secondary replicas has been increased from 4 to 8.
SQL Server 2014 also adds the option to use Cluster Shared Volumes as cluster shared disks for Failover Cluster Instances on Windows Server 2012 and above.
New Dynamic Management Views have also been added to ease troubleshooting. You can read about them in my previous tip "Understanding Dynamic Management Views in SQL Server 2014".

SQL Server Performance Enhancements

Several features have been added to improve performance. They are described in the following sections.

SQL Server Transactions with Delayed Durability

In order to reduce latency, transactions can be defined as delayed durable, which means that the transaction returns control to the client before the transaction log record is written to disk.
This can be defined at the database level, at the COMMIT level, or at the ATOMIC block level in Natively Compiled Stored Procedures. In addition, the stored procedure sys.sp_flush_log is included to flush the transaction log to disk, making previously committed delayed durable transactions durable.
Here is a sample (the database, table, and variable names are illustrative).
-- Set the DB option to allow transactions with delayed durability.
USE [master];
ALTER DATABASE [SampleDB] SET DELAYED_DURABILITY = ALLOWED;
GO
USE [SampleDB];
GO
DECLARE @SomeData VARCHAR(50) = 'New Value', @SomeID INT = 1;
BEGIN TRANSACTION;
UPDATE dbo.SomeTable
   SET SomeColumn = @SomeData
 WHERE SomePk = @SomeID;
-- Commit the current transaction with delayed durability.
COMMIT TRANSACTION WITH (DELAYED_DURABILITY = ON);
GO
-- Flush the transaction log to disk to harden delayed durable transactions.
EXEC sys.sp_flush_log;

SQL Server Query Optimizer

SQL Server 2014 substantially improves the component of the engine that creates and optimizes query plans, most notably with a redesigned cardinality estimator. Stay tuned for more.

SQL Server Table and Index Operations

In SQL Server 2014, single partitions can be rebuilt online, and additional partition switching and index rebuild operations can be performed while the table remains online.
Furthermore, you can now manage the lock priority of online operations on tables and indexes with the WAIT_AT_LOW_PRIORITY option. This option lets you specify the maximum time the operation will wait for locks and what to do when that time is exceeded. Here is some additional information:
MAX_DURATION: The time, in minutes, the online operation will wait at low priority.
ABORT_AFTER_WAIT: The action taken by the online operation when it remains blocked beyond the MAX_DURATION value:
NONE: Continue waiting for the lock with normal priority.
SELF: Abort the online operation itself without taking action against blockers.
BLOCKERS: Kill the transactions that block the online operation.
Here is a sample code.
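The sketch below shows an online rebuild using this option (the table and index names are hypothetical):

```sql
-- Wait up to 5 minutes at low priority, then kill blocking transactions.
ALTER INDEX IX_SomeIndex ON dbo.SomeTable
REBUILD WITH (ONLINE = ON (
    WAIT_AT_LOW_PRIORITY (MAX_DURATION = 5 MINUTES, ABORT_AFTER_WAIT = BLOCKERS)
));
```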

SQL Server 2014 Incremental Option for CREATE STATISTICS

SQL Server 2014 permits statistics creation per partition by setting the INCREMENTAL option to ON in the CREATE STATISTICS statement. Here is an example (the statistics and column names are illustrative).
CREATE STATISTICS SomeStatistics
ON dbo.SomeTable (SomeColumn)
WITH INCREMENTAL = ON;

SQL Server 2014 Buffer Pool Extension

This feature enables SQL Server to be configured to use an SSD as an extension of the database buffer pool in order to reduce I/O latency.
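A minimal configuration sketch (the file path and size below are hypothetical and should match your SSD volume):

```sql
-- Enable the buffer pool extension on an SSD volume.
ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION ON
    (FILENAME = 'F:\SSDCACHE\BufferPool.bpe', SIZE = 50 GB);
```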

SQL Server 2014 Resource Governor

With the SQL Server 2014 release, we can set constraints on physical I/O operations. The MAX_OUTSTANDING_IO_PER_VOLUME argument has been added to the ALTER RESOURCE GOVERNOR statement, allowing us to set the maximum outstanding I/O operations per disk volume and tune the SQL Server instance's I/O according to the disk's characteristics.
We can also set disk I/O thresholds for disk volumes on Resource Pools with two new settings, MAX_IOPS_PER_VOLUME and MIN_IOPS_PER_VOLUME, which set the maximum and minimum I/O operations per second, respectively.
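A sketch combining both settings (the pool name and threshold values are hypothetical):

```sql
-- Cap the whole instance at 100 outstanding I/Os per volume.
ALTER RESOURCE GOVERNOR WITH (MAX_OUTSTANDING_IO_PER_VOLUME = 100);
-- Throttle a hypothetical pool between 50 and 500 IOPS per volume.
CREATE RESOURCE POOL ReportingPool
WITH (MIN_IOPS_PER_VOLUME = 50, MAX_IOPS_PER_VOLUME = 500);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```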

SQL Server 2014 Columnstore Index Improvements

SQL Server 2014 adds updatable clustered Columnstore indexes.
Another feature is the capability to compress Columnstore indexes, both clustered and nonclustered, even further. For this, two arguments have been added to the REBUILD option of the ALTER INDEX statement to handle Columnstore index compression:
COLUMNSTORE: The default Columnstore index compression.
COLUMNSTORE_ARCHIVE: Compresses the Columnstore index even more.
Also improvements in Batch processing mode have been made.  Stay tuned for more information.
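For instance, archival compression can be applied during a rebuild like this (the index and table names are hypothetical):

```sql
-- Recompress a clustered columnstore index with archival compression.
ALTER INDEX CCI_FactSales ON dbo.FactSales
REBUILD WITH (DATA_COMPRESSION = COLUMNSTORE_ARCHIVE);
```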

SQL Server 2014 Security Enhancements

SQL Server 2014 includes the following security improvements: backup encryption and new permissions.

SQL Server 2014 Backup Encryption

SQL Server backup data can now be encrypted during backup creation using one of several encryption algorithms: AES 128, AES 192, AES 256, and Triple DES. I will cover this topic in an upcoming tip.
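As a quick sketch (this assumes a certificate named BackupCert already exists in the master database; the database and path are hypothetical):

```sql
-- Encrypt the backup with AES 256 protected by a server certificate.
BACKUP DATABASE [SampleDB]
TO DISK = 'D:\Backups\SampleDB_Encrypted.bak'
WITH COMPRESSION,
     ENCRYPTION (ALGORITHM = AES_256, SERVER CERTIFICATE = BackupCert);
```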

Permission Changes in SQL Server 2014

The following permissions have been added in SQL Server 2014:
CONNECT ANY DATABASE: Grants CONNECT permission on all current databases and on any database created in the future.
IMPERSONATE ANY LOGIN: Allows (or, when denied, blocks) impersonation of logins. Denying it is useful to prevent impersonation of highly privileged logins.
SELECT ALL USER SECURABLES: A server-level permission that allows a login to run SELECT statements in all databases to which that login can connect.
ALTER ANY DATABASE EVENT SESSION: A database-level permission.
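The server-level permissions above are granted or denied to logins; the login names below are hypothetical:

```sql
-- Hypothetical logins illustrating the new SQL Server 2014 permissions.
GRANT CONNECT ANY DATABASE TO [MonitoringLogin];
GRANT SELECT ALL USER SECURABLES TO [AuditorLogin];
DENY IMPERSONATE ANY LOGIN TO [AppLogin];
```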

Using CROSS APPLY to optimize joins on BETWEEN conditions

Recently I encountered a case where I knew much more about the data than the optimizer. Originally the performance was horrible; that is why I had to look at the query in the first place. When I was able to share my knowledge with the optimizer, it produced a better plan, and the query ran dramatically faster.

The slow query

The following tables store one-minute commercials for every minute of one year, and customer calls, one call per minute, for the same year. The scripts that populate the tables with test data are provided at the end of this post. Here are the tables:

CREATE TABLE dbo.Commercials(
    StartedAt DATETIME NOT NULL PRIMARY KEY,
    EndedAt DATETIME NOT NULL,
    CommercialName VARCHAR(30) NOT NULL);
GO
CREATE TABLE dbo.Calls(
    CallID INT NOT NULL PRIMARY KEY,
    AirTime DATETIME NOT NULL,
    SomeInfo CHAR(300));
GO
CREATE UNIQUE INDEX Calls_AirTime
    ON dbo.Calls(AirTime) INCLUDE(SomeInfo);
GO

Every commercial in my table lasts for at most one minute, and commercials do not overlap. I can easily enforce both conditions with constraints (see "Storing intervals of time with no overlaps"), which are omitted in this post just to keep it simple.
The following query retrieves only 181 rows, and it runs very slowly:
SELECT s.StartedAt, s.EndedAt, c.AirTime
FROM dbo.Commercials s
JOIN dbo.Calls c ON c.AirTime >= s.StartedAt AND c.AirTime < s.EndedAt
WHERE c.AirTime BETWEEN '20080701' AND '20080701 03:00'
 Table 'Calls'. Scan count 1, logical reads 11, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 2, logical reads 3338264, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Commercials'. Scan count 2, logical reads 7166, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

SQL Server Execution Times:
   CPU time = 71704 ms,  elapsed time = 36316 ms.
Why is it so slow? I haven't mastered the fine art of adding images to my posts yet, so I have to explain verbally. For every call, the database engine scans all the commercials that begin before the time of the call, which is expensive. The reason is simple: the optimizer does not know that the commercials are short and do not overlap, so it must scan all the potential matches, which are all the commercials that begin before the time of the call.

Using CROSS APPLY to tell the optimizer that commercials do not overlap

Because commercials do not overlap, we need at most one match per call. Translating this information into plain SQL is easy, and the query runs dramatically faster:

SELECT s.StartedAt, s.EndedAt, c.AirTime
FROM dbo.Calls c
CROSS APPLY(
    SELECT TOP 1 s.StartedAt, s.EndedAt
    FROM dbo.Commercials s
    WHERE c.AirTime >= s.StartedAt AND c.AirTime < s.EndedAt
    ORDER BY s.StartedAt DESC) AS s
WHERE c.AirTime BETWEEN '20080701' AND '20080701 03:00'

Table 'Commercials'. Scan count 181, logical reads 1327, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Calls'. Scan count 1, logical reads 11, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

SQL Server Execution Times:
   CPU time = 31 ms,  elapsed time = 31 ms.

Note: if you needed only one column from the Commercials table, you could easily use just a subquery. Because more than one column is needed, CROSS APPLY is the better, more performant choice, without redundant code.
Note: if your query relies on the assumption that commercials do not overlap, you have to enforce that business rule in the database. Also, to make sure you don't forget that your query relies on that assumption, use a unit test to document it.

Using another range condition to tell the optimizer that commercials are short

Because commercials are short, there is no need to scan the commercials that start more than the maximum commercial length before the call. Again, translating this information into plain SQL is quite easy, and again the query runs much faster, even faster than the previous one:

SELECT s.StartedAt, s.EndedAt, c.AirTime
FROM dbo.Commercials s
JOIN dbo.Calls c ON c.AirTime >= s.StartedAt AND c.AirTime < s.EndedAt
WHERE c.AirTime BETWEEN '20080701' AND '20080701 03:00'
  AND s.StartedAt BETWEEN '20080630 23:45' AND '20080701 03:00'

Table 'Worktable'. Scan count 1, logical reads 753, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Calls'. Scan count 1, logical reads 11, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Commercials'. Scan count 1, logical reads 4, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

SQL Server Execution Times:
   CPU time = 31 ms,  elapsed time = 24 ms.

Note: if your query relies on the assumption that commercials are short, you have to enforce that business rule in the database. Also, to make sure you don't forget that your query relies on that assumption, use a unit test to document it.

Setting up tables and test data

-- Build a helper Numbers table of sequential integers by repeated doubling.
CREATE TABLE dbo.Numbers(n INT NOT NULL PRIMARY KEY);
GO
DECLARE @i INT;
INSERT INTO dbo.Numbers(n) VALUES(1);
SET @i = 1;
WHILE @i < 24*365*60 BEGIN
    INSERT INTO dbo.Numbers(n)
    SELECT n + @i FROM dbo.Numbers;
    SET @i = @i * 2;
END;
GO
INSERT INTO dbo.Commercials(StartedAt, EndedAt, CommercialName)
SELECT DATEADD(minute, n - 1, '20080101'), DATEADD(minute, n, '20080101'),
       'Show #' + CAST(n AS VARCHAR(6))
FROM dbo.Numbers
WHERE n <= 24*365*60;
GO
INSERT INTO dbo.Calls(CallID, AirTime, SomeInfo)
SELECT n, DATEADD(minute, n - 1, '20080101'),
       'Call during Commercial #' + CAST(n AS VARCHAR(6))
FROM dbo.Numbers
WHERE n <= 24*365*60;
GO


SQL Server 2005 introduced the APPLY operator, which is much like a join clause: it allows joining a left/outer table expression with a right/inner table expression. The difference between a join and the APPLY operator becomes evident when you have a table-valued expression on the right side and you want this expression to be evaluated for each row from the left table expression. In this tip I am going to demonstrate what the APPLY operator is, how it differs from regular JOINs, and a few of its applications.
The APPLY operator allows you to join two table expressions; the right table expression is processed once for each row from the left table expression. As you might have guessed, the left table expression is evaluated first, and then the right table expression is evaluated against each row of the left table expression to produce the final result set. The final result set contains all the selected columns from the left table expression followed by all the columns of the right table expression.
The APPLY operator comes in two variants: CROSS APPLY and OUTER APPLY. CROSS APPLY returns only those rows from the left table expression for which the right table expression returns rows; in other words, a row appears in the output only when the right table expression produces a match for it. OUTER APPLY returns all the rows from the left table expression regardless of whether they match the right table expression; for rows with no corresponding match, the columns of the right table expression contain NULL values. So you might now conclude that CROSS APPLY is semantically equivalent to an INNER JOIN (or, to be more precise, it's like a CROSS JOIN with a correlated subquery) with an implicit join condition of 1=1, whereas OUTER APPLY is semantically equivalent to a LEFT OUTER JOIN.
You might be wondering: if the same can be achieved with a regular JOIN clause, why and when would you use the APPLY operator? Though the same can often be achieved with a normal JOIN, the need for APPLY arises when you have a table-valued expression on the right side, and in some cases the APPLY operator boosts the performance of your query. Let me explain with the help of some examples.
Script #1 creates a Department table to hold information about departments. Then it creates an Employee table, which holds information about employees. Please note that each employee belongs to a department, hence the Employee table has referential integrity with the Department table.
Script #1 - Creating some temporary objects to work on...
USE [tempdb]
GO
IF EXISTS (SELECT * FROM sys.objects WHERE OBJECT_ID = OBJECT_ID(N'[Employee]') AND type IN (N'U')) BEGIN
    DROP TABLE [Employee]
END
GO
IF EXISTS (SELECT * FROM sys.objects WHERE OBJECT_ID = OBJECT_ID(N'[Department]') AND type IN (N'U')) BEGIN
    DROP TABLE [Department]
END
GO
CREATE TABLE [Department](
    [DepartmentID] [int] NOT NULL PRIMARY KEY,
    [Name] VARCHAR(250) NOT NULL
) ON [PRIMARY]
GO
INSERT [Department] ([DepartmentID], [Name]) VALUES (1, N'Engineering')
INSERT [Department] ([DepartmentID], [Name]) VALUES (2, N'Administration')
INSERT [Department] ([DepartmentID], [Name]) VALUES (3, N'Sales')
INSERT [Department] ([DepartmentID], [Name]) VALUES (4, N'Marketing')
INSERT [Department] ([DepartmentID], [Name]) VALUES (5, N'Finance')
GO
CREATE TABLE [Employee](
    [EmployeeID] [int] NOT NULL PRIMARY KEY,
    [FirstName] VARCHAR(250) NOT NULL,
    [LastName] VARCHAR(250) NOT NULL,
    [DepartmentID] [int] NOT NULL REFERENCES [Department](DepartmentID)
)
GO
-- DepartmentID values below are illustrative.
INSERT [Employee] ([EmployeeID], [FirstName], [LastName], [DepartmentID]) VALUES (1, N'Orlando', N'Gee', 1)
INSERT [Employee] ([EmployeeID], [FirstName], [LastName], [DepartmentID]) VALUES (2, N'Keith', N'Harris', 2)
INSERT [Employee] ([EmployeeID], [FirstName], [LastName], [DepartmentID]) VALUES (3, N'Donna', N'Carreras', 3)
INSERT [Employee] ([EmployeeID], [FirstName], [LastName], [DepartmentID]) VALUES (4, N'Janet', N'Gates', 3)
GO
The first query in Script #2 selects data from the Department table and uses CROSS APPLY to evaluate the Employee table for each record of the Department table. The second query simply joins the Department table with the Employee table, and all the matching records are produced.
Script #2 - CROSS APPLY and INNER JOIN
SELECT * FROM Department D
CROSS APPLY (
    SELECT * FROM Employee E
    WHERE E.DepartmentID = D.DepartmentID
) A
GO
SELECT * FROM Department D
INNER JOIN Employee E ON D.DepartmentID = E.DepartmentID
GO
[Image: CROSS APPLY and INNER JOIN result set]
If you look at the results they produced, it is the exact same result-set; not only that even the execution plan for these queries are similar to each other and has equal query cost, as you can see in the image below. So what is the use of APPLY operator?  How does it differ from a JOIN and how does it help in writing more efficient queries. I will discuss this later, but first let me show you an example of OUTER APPLY also.
[Image: CROSS APPLY and INNER JOIN query plan]
The first query in Script #3 selects data from the Department table and uses OUTER APPLY to evaluate the Employee table for each record of the Department table. For rows with no match in the Employee table, the result contains NULL values, as you can see in rows 5 and 6. The second query simply uses a LEFT OUTER JOIN between the Department table and the Employee table. As expected, the query returns all rows from the Department table, even those with no match in the Employee table.
Script #3 - OUTER APPLY and LEFT OUTER JOIN
SELECT * FROM Department D
OUTER APPLY (
    SELECT * FROM Employee E
    WHERE E.DepartmentID = D.DepartmentID
) A
GO
SELECT * FROM Department D
LEFT OUTER JOIN Employee E ON D.DepartmentID = E.DepartmentID
GO
[Image: OUTER APPLY and LEFT OUTER JOIN result set]
Even though the above two queries return the same information, the execution plans are a bit different. Cost-wise there is not much difference, but the query with the OUTER APPLY uses a Compute Scalar operator (with an estimated operator cost of 0.0000103, almost 0% of the total query cost) before the Nested Loops operator to evaluate and produce the columns of the Employee table.
[Image: OUTER APPLY and LEFT OUTER JOIN query plan]
Now comes the time to see where the APPLY operator is really required. In Script #4, I create a table-valued function that accepts a DepartmentID as its parameter and returns all the employees who belong to that department. The next query selects data from the Department table and uses CROSS APPLY to join with the function we created. It passes the DepartmentID for each row from the outer table expression (in our case, the Department table) and evaluates the function for each row, similar to a correlated subquery. The final query uses OUTER APPLY in place of CROSS APPLY; hence, unlike CROSS APPLY, which returns only correlated data, OUTER APPLY returns non-correlated data as well, placing NULLs into the missing columns.
Script #4 - APPLY with table-valued function
IF EXISTS (SELECT * FROM sys.objects WHERE OBJECT_ID = OBJECT_ID(N'[fn_GetAllEmployeeOfADepartment]') AND type IN (N'IF')) BEGIN
    DROP FUNCTION dbo.fn_GetAllEmployeeOfADepartment
END
GO
CREATE FUNCTION dbo.fn_GetAllEmployeeOfADepartment(@DeptID AS INT) RETURNS TABLE
AS RETURN
(
    SELECT * FROM Employee E
    WHERE E.DepartmentID = @DeptID
)
GO
SELECT * FROM Department D CROSS APPLY dbo.fn_GetAllEmployeeOfADepartment(D.DepartmentID)
GO
SELECT * FROM Department D OUTER APPLY dbo.fn_GetAllEmployeeOfADepartment(D.DepartmentID)
GO
[Image: APPLY with table-valued function result set]
So now you may be wondering: can we use a simple join in place of the above queries? The answer is no. If you replace CROSS/OUTER APPLY in the above queries with INNER JOIN/LEFT OUTER JOIN, specify an ON clause (something like 1=1) and run the query, you will get the error "The multi-part identifier "D.DepartmentID" could not be bound." This is because with JOINs the execution context of the outer query is different from the execution context of the function (or a derived table), and you cannot bind a value/variable from the outer query to the function as a parameter. Hence the APPLY operator is required for such queries.
So, in summary, the APPLY operator is required when you have to use a table-valued function in the query, but it can also be used with inline SELECT statements.
Now let me show you another query with a Dynamic Management Function (DMF). Script #5 returns all the currently executing user queries except those being executed by the current session. As you can see in the script below, the sys.dm_exec_requests dynamic management view is CROSS APPLY'ed with the sys.dm_exec_sql_text dynamic management function, which accepts a plan handle for the query; that plan handle is passed from the left/outer expression to the function so it can return the query text.
Script #5 - APPLY with Dynamic Management Function (DMF)
USE master
GO
SELECT DB_NAME(r.database_id) AS [Database], st.[text] AS [Query]
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.plan_handle) st
WHERE r.session_id > 50            -- Consider spids for users only, no system spids.
  AND r.session_id NOT IN (@@SPID) -- Don't include the request from the current spid.
Please note that the [text] column in the above query returns all queries submitted in a batch; if you want to see only the active (currently executing) statement, you can use the statement_start_offset and statement_end_offset columns to trim out the active part of the query. Tim Ford has provided a very good explanation of the usage of these columns in his How to isolate the current running commands in SQL Server tip.
As I mentioned before, there are certain scenarios where a query with the APPLY operator performs better than a query with regular joins, but I am not going to delve into the details here; there are articles that discuss this topic in greater detail.
Please note that the APPLY operator is not an ANSI operator but rather a SQL Server T-SQL extension (available in SQL Server 2005 and above), so if you plan to port your database to some other DBMS, take this into consideration.
