European Windows 2012 Hosting BLOG

BLOG about Windows 2012 Hosting and SQL 2012 Hosting - Dedicated to European Windows Hosting Customer

Europe FREE Windows Hosting - Germany :: Dedicated Cloud Server With HostForLIFE.eu

April 17, 2014 11:33 by author Peter

A cloud server hosting company gives you the opportunity to buy resources that help with customization, and comes with innumerable functionalities that are delivered over the network in a similar manner to a traditional hosting solution. It is possible to avoid all kinds of hardware-related hassles, and minimal maintenance is required. For a cloud server, a monthly fee is paid to the company.

HostForLIFE.eu is one such company that can deliver performance and dedication to your family of websites with a Cloud Server. But before you know whether or not a cloud server is needed, you must know what one is. A cloud server gives your site, or family of sites, all the resources necessary to navigate technical issues and keep things running smoothly. Instead of sharing personnel, infrastructure and storage space with a pack of unrelated sites, you have the freedom to control operation and mold the server to meet your needs. This can be beneficial in a number of ways.

A dedicated cloud server has its own processor, Random Access Memory (RAM) and bandwidth capability. Dedicated cloud servers allow you to install and run almost any program. They additionally allow other users, to whom you have given access, to connect to your dedicated cloud server and use those same programs at the same time you do. This has made dedicated servers very popular amongst internet gamers. Dedicated cloud servers offer all the same features as regular dedicated hosting servers, and they can be used for less serious pursuits as well.

According to the needs of clients, a Windows platform such as Windows Server 2008/2012 is provided for those using Visual Basic scripts, Active Server Pages (ASP.NET), Microsoft Access or a Microsoft SQL database. Other important factors for a web hosting service are the disk space and bandwidth required. HostForLIFE.eu offers a cloud server solution with 8 GB RAM, 2x500 GB storage space, 5000 GB bandwidth and a 1000 Mbps connection speed, with a Netherlands or US data center.

A dedicated cloud server can also keep you safe from the hazards of the web. Spyware, viruses, worms, and other nasty-sounding names are waiting for an opening, planning an attack on your site that will set you back in more ways than one. If your users sense that a website has inadequate protection, it will make them think hard about returning. Hackers may also try to break in and steal financial information if you use your website for commercial purposes. Keeping the predators at bay means having knowledge of all the latest security measures and how to implement them. Or you can just sign up for a dedicated server and not worry about it.

Different features of cloud servers 

The most important feature is that you get true value for money. You pay the same amount as for basic shared hosting, yet get the set of features found in a dedicated server. A cloud solution is a way to save a good sum of money while enjoying the features necessary for business. A cloud server also comes with virtually no downtime, which means that you will rarely see an error page for your website; it loads in different browsers whenever customers or viewers want to check it. Downtime in this regard does not last for more than a few seconds, which is an amazing feature.

The next advantage is related to adding and removing servers at any point in time, so you can instantly scale your solution up. You may be in need of another server to handle a large amount of traffic, and once things have settled it is possible to scale the server back down. This has turned out to be a useful feature for people who are making use of cloud computing.

Benefits of Dedicated Cloud Server With HostForLIFE.eu

Faster Service

For most people what is in short supply is time. On this point a cloud server scores above dedicated servers.

Security

A Cloud Server provides the highest level of security, just like a dedicated cloud server, complete with customizable firewall protection and secure isolated disk space, along with the benefit of sharing the cost of the server with other users while still reaping the benefits of a dedicated cloud server.

Performance

All resources can be regulated in the Cloud Server environment. One customer cannot run away with a large share of the resources. Your service will run reliably and predictably.

Control

Unlike a shared server, a Cloud Server gives you control of the server. You can choose the software you want to install, the RAM, the processor, etc.

Scalability

Creating backups of cloud servers can consume time. Ordinary cloud servers face immense pressure as the number of clients can balloon and go out of control. That does not happen with dedicated cloud servers, where the numbers are manageable.



SQL Server 2014 Hosting Netherlands - HostForLIFE.eu :: Date Conversions on SQL Server

April 16, 2014 07:54 by author Peter

In my current project I need to query an MS SQL Server database. Unfortunately the dates are stored as a BigInt instead of a proper date datatype. So I had to find out how to compare the dates with the system date, and how to get the system date in the first place. To log this for possible later use, here is, as an exception, a blog about SQL Server. To get the system date, you can do:

(SELECT dt=GETDATE()) a

It's maybe my Oracle background, but I would write this like:
(SELECT GETDATE() dt) a

An alternative is:
select CURRENT_TIMESTAMP

I found this at another blog. Contrary to the writer of that blog I would prefer this version, since I found that it works on Oracle too. There are several ways to convert this to a bigint, but the most compact I found is:
( SELECT YEAR(dt)*10000 + MONTH(dt)*100 + DAY(dt) sysdateInt
  FROM
    -- Test Data
    (SELECT GETDATE() dt) a ) utl

The way I wrote this makes it useful as a subquery or a joined query:

SELECT
  Ent.* ,
  CASE
    WHEN Ent.endDate  IS NOT NULL
    AND Ent.endDate-1 < sysdateInt
    THEN Ent.endDate-1
    ELSE sysdateInt
  END refEndDateEntity ,
  utl.sysdateInt
FROM
  SomeEntity Ent,
  ( SELECT YEAR(dt)*10000 + MONTH(dt)*100 + DAY(dt) sysdateInt
    FROM
      -- Test Data
      (SELECT GETDATE() dt) a ) utl;

To convert a bigint to a date, you can do the following:
CONVERT(DATETIME, CONVERT(CHAR(8), ent.endDate))

However, I found that although this works in a select clause, in the where-clause it would run into a "Data Truncation" error. Maybe it is due to the use of SQL Developer and thus a JDBC connection to SQL Server, but I'm not so enthusiastic about the error responses of SQL Server... I assume the error has to do with the fact that SQL Server has to interpret a column value of a row it has not yet selected, that is, when evaluating whether to add the row to the result set. So to make it work I added the conversion as a determination value in the select clause of a 1:1 view on the table, and used that view instead of the table. The converted value can then be used in the where clause.
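The workaround can be sketched like this; the view and column names are made up, and the table follows the example above:

```sql
-- A 1:1 view on the table that exposes the bigint date
-- as a proper DATETIME determination column.
CREATE VIEW dbo.SomeEntityVw AS
SELECT
  Ent.*,
  CONVERT(DATETIME, CONVERT(CHAR(8), Ent.endDate)) endDateDt
FROM SomeEntity Ent;

-- The converted value can now safely be used in a where-clause:
SELECT *
FROM dbo.SomeEntityVw
WHERE endDateDt < GETDATE();
```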



SQL Server 2014 Hosting UK - HostForLIFE.eu :: SQL Server 2014 Overview

April 9, 2014 19:41 by author Peter

SQL Server 2014 is the next generation of Microsoft’s information platform, with new features that deliver faster performance, expand capabilities in the cloud, and provide powerful business insights.  In this blog posting I want to give you an overview about the various performance related enhancements that are introduced.

Lock Priorities

As you might know, the Enterprise Edition of SQL Server gives you Online Operations, or as I call them "Almost Online Operations". They are almost online, because internally SQL Server still has to acquire some locks, which can lead to blocking situations. For that reason SQL Server 2014 introduces Lock Priorities, with which you can control how SQL Server should react if such a blocking situation occurs.
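As a sketch (the index and table names here are made up), a low-priority online rebuild looks like this:

```sql
-- Wait at low priority for up to 1 minute; if the lock still
-- cannot be acquired, abort our own rebuild instead of blocking others.
ALTER INDEX IX_Orders_OrderDate ON dbo.Orders
REBUILD WITH
(
    ONLINE = ON
    (
        WAIT_AT_LOW_PRIORITY
        (
            MAX_DURATION = 1 MINUTES,
            ABORT_AFTER_WAIT = SELF
        )
    )
);
```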

Buffer Pool Extensions

The idea behind Buffer Pool Extensions is very simple: expand the Buffer Pool with a paging file that is stored on very fast storage, like SSD drives. The Buffer Pool Extensions come in quite handy if you don't have the ability to physically add more RAM to your database server.
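Enabling the extension is a one-liner; the file path and size below are of course illustrative:

```sql
-- Enable a 32 GB extension file on an SSD volume.
ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION ON
    (FILENAME = 'E:\SSD\ExtensionFile.BPE', SIZE = 32 GB);

-- Check the current state:
SELECT * FROM sys.dm_os_buffer_pool_extension_configuration;
```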

Resource Governor

Resource Governor was first introduced back with SQL Server 2008, but it wasn't really a mature technology, because you had no possibility to govern I/O operations on the storage level, and you also had no way to limit the size of the Buffer Pool for a specific workload group. With SQL Server 2014 things are changing, because you can now throttle I/O operations. Limiting Buffer Pool usage is still not possible, but hey, who knows what comes in SQL Server 2016.

Clustered ColumnStore Indexes

One of the hottest enhancements in SQL Server 2014 is the introduction of Clustered ColumnStore Indexes, an amazing new concept for dealing with ColumnStore data in SQL Server. In addition, a Clustered ColumnStore Index can also be changed directly, without using tricks like Partition Switching.
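A minimal sketch (the fact table name is made up):

```sql
-- Convert the whole table into columnar storage.
CREATE CLUSTERED COLUMNSTORE INDEX CCI_FactSales
ON dbo.FactSales;

-- Unlike the non-clustered ColumnStore index of SQL Server 2012,
-- the table stays directly writable:
INSERT INTO dbo.FactSales (ProductId, Quantity) VALUES (42, 10);
```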

In-Memory OLTP

With In-Memory OLTP Microsoft claims that the performance of your workload can be improved up to 100x. Awesome! Everything is now stored directly in memory, without touching your physical storage anymore (besides the transaction log, if you want). In addition, In-Memory OLTP is based on so-called Lock-Free Data Structures, meaning locking, blocking, latching, and spinlocking are just gone. Of course, there are side effects and even limitations with this promising approach…
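A sketch of a memory-optimized table (the table and its columns are invented for illustration; the database first needs a MEMORY_OPTIMIZED_DATA filegroup):

```sql
CREATE TABLE dbo.ShoppingCart
(
    CartId INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1048576),
    UserId INT NOT NULL,
    CreatedDate DATETIME2 NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```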

Delayed Transactions

It doesn't matter how good the throughput of your workload is, the final barrier and bottleneck is almost always the transaction log. Because of the Write-Ahead Logging mechanism used by SQL Server, a transaction must always be physically written to the transaction log before the transaction is committed. When your transaction log is on slow storage, your performance and throughput will suffer. For that reason SQL Server 2014 implements so-called Delayed Transactions.
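The feature surfaces as the DELAYED_DURABILITY option; a sketch (database and table names are made up):

```sql
-- Allow delayed durability in the database...
ALTER DATABASE SalesDB SET DELAYED_DURABILITY = ALLOWED;

-- ...and opt in per transaction: the commit returns before
-- the log records are flushed to disk.
BEGIN TRANSACTION;
    UPDATE dbo.Orders SET Quantity = Quantity + 1 WHERE OrderId = 1;
COMMIT TRANSACTION WITH (DELAYED_DURABILITY = ON);
```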

Cardinality Estimation

Cardinality Estimation is one of the most important things in a relational database, because these estimations are fed into the Query Optimizer, whose job it is to produce a good-enough execution plan. With SQL Server 2014 Microsoft has rewritten the cardinality estimator completely from scratch to overcome some limitations rooted in the history of this very important component.



HostForLIFE.eu Proudly Announces Microsoft SQL Server 2014 Hosting

April 7, 2014 11:06 by author Peter

HostForLIFE.eu was established to cater to an underserved market in the hosting industry: web hosting for customers who want excellent service. HostForLIFE.eu, a worldwide provider of hosting, has announced the latest release of Microsoft's widely used SQL relational database management system, SQL Server 2014. You can take advantage of the powerful SQL Server 2014 technology in all Windows Shared Hosting, Windows Reseller Hosting and Windows Cloud Hosting packages! In addition, SQL Server 2014 Hosting enables customers to build mission-critical applications and Big Data solutions using high-performance, in-memory technology across OLTP, data warehousing, business intelligence and analytics workloads without having to buy expensive add-ons or high-end appliances.

SQL Server 2014 accelerates reliable, mission-critical applications with a new in-memory OLTP engine that can deliver on average 10x, and up to 30x, transactional performance gains. For data warehousing, the new updatable in-memory column store can query 100x faster than legacy solutions. The first new option is Microsoft SQL Server 2014 Hosting, which is available to customers from today. With the public release just last week of Microsoft's latest version of their premier database product, HostForLIFE has been quick to respond by updating their shared server configurations. For more information about this new product, please visit http://hostforlife.eu/European-SQL-Server-2014-Hosting

About Us:
HostForLIFE.eu has been awarded the No. 1 SPOTLIGHT Recommended Hosting Partner by Microsoft (see http://www.microsoft.com/web/hosting/HostingProvider/Details/953). Our service is ranked the top #1 spot in several European countries, such as Germany, Italy, the Netherlands, France, Belgium, the United Kingdom, Sweden, Finland, Switzerland and other European countries. Besides this award, we have also won several awards from reputable organizations in the hosting industry, and the details can be found on our official website.


FREE SQL Server 2012 Hosting UK - HostForLIFE.eu :: An Application Error Occurred On The Server Running On SQL Server 2012

March 29, 2014 18:42 by author Peter

Recently one of my application websites went down. I went through all the basic connectivity troubleshooting on my SQL Server 2012 hosting and everything seemed to be working fine. Finally, I traced the problem to the SQL Browser service, even though it was in a running state.

Error from the event viewer:

The quick solution is restarting the SQL Browser service (Start –> All Programs –> Microsoft SQL Server 200X –> Configuration Tools –> SQL Server Configuration Manager) without restarting the SQL Server service. I searched and found a couple of MS links (KB-2526552 and "SQLBrowser Unable to start"), but I did not apply them. I used another way that is also a permanent fix.

Troubleshooting ways and a permanent fix:

In my case it is a named instance listening on a dynamic port, and DBAs know the Browser service is mainly for named instances. From the local machine we can connect to the server through SSMS by using the server name, or the server name plus port number. But from any machine other than the local one, you cannot connect to the server by using the server name alone. (You can test that by connecting from some other server, or better, install only SSMS on the application server and try to connect.) So I went to the application server and opened the connection string; as expected, the data source only had the server name. We changed it from Data Source=Servername\Instance to Data Source=Servername\Instance,port, e.g. Muthu1\SQL1,5432.

The application team made it a standard to always include the port number in the connection string block, i.e. a fully qualified data source.

Basic SQL connectivity checks:

- Check the SQL Server service is running and try to connect through SSMS, both locally and remotely

- Check the TCP/IP protocol is enabled in SQL Server Configuration Manager and find the port number

- Connect using server + port number from SSMS, locally and remotely

- For a firewall block/port not opened, you can check through the command prompt with TELNET server port, e.g. TELNET server 1433

- Check that remote connections are enabled and the SQL Browser service is running (for a named instance which is not using a fully qualified data source)

- Check whether you have any alias/DNS name.



European FREE ASP.NET 4.5 Hosting UK - HostForLIFE.eu :: Sending additional form data in multipart uploads with ASP.NET Web API

March 20, 2014 06:10 by author Peter

If you've used ASP.NET MVC you'll be used to being able to easily process multipart form data on the server by virtue of the ASP.NET MVC model binder. Unfortunately things are not quite so simple in ASP.NET Web API. Most of the examples I have seen demonstrate how to upload a file but do not cover how to send additional form data with the request. The example below (taken from the Fabrik API Client) demonstrates how to upload a file using HttpClient:

public async Task<IEnumerable<MediaUploadResult>> UploadMedia(int siteId, params UploadMediaCommand[] commands)
{
    Ensure.Argument.NotNull(commands, "commands");
    Ensure.Argument.Is(commands.Length > 0, "You must provide at least one file to upload.");

    var formData = new MultipartFormDataContent();

    foreach (var command in commands)
    {
        formData.Add(new StreamContent(command.FileStream), command.FileName, command.FileName);
    }

    var request = api.CreateRequest(HttpMethod.Post, api.CreateRequestUri(GetMediaPath(siteId)));
    request.Content = formData;

    var response = await api.HttpClient.SendAsync(request).ConfigureAwait(false);
    return await response.Content.ReadAsAsync<IEnumerable<MediaUploadResult>>().ConfigureAwait(false);
}

The UploadMediaCommand objects passed to this method contain a Stream obtained from an uploaded file in ASP.NET MVC. As you can see, we loop through each command (file) and add it to the MultipartFormDataContent. This effectively allows us to perform multiple file uploads at once. When making some changes to our API recently I realized we needed a way to correlate the files we uploaded with the MediaUploadResult objects sent back in the response. We therefore needed to send a unique identifier for each file included in the multipart form.

Since the framework doesn't really offer a nice way of adding additional form data to MultiPartFormDataContent, I've created a few extension methods below that you can use to easily send additional data with your file uploads.

/// <summary>
/// Extensions for <see cref="System.Net.Http.MultipartFormDataContent"/>.
/// </summary>
public static class MultiPartFormDataContentExtensions
{
    public static void Add(this MultipartFormDataContent form, HttpContent content, object formValues)
    {
        // The named argument forces this call to bind to the private
        // overload below instead of recursing into this method.
        Add(form, content, formValues, name: null);
    }

    public static void Add(this MultipartFormDataContent form, HttpContent content, string name, object formValues)
    {
        Add(form, content, formValues, name: name);
    }

    public static void Add(this MultipartFormDataContent form, HttpContent content, string name, string fileName, object formValues)
    {
        Add(form, content, formValues, name: name, fileName: fileName);
    }

    private static void Add(this MultipartFormDataContent form, HttpContent content, object formValues, string name = null, string fileName = null)
    {
        var header = new ContentDispositionHeaderValue("form-data");
        header.Name = name;
        header.FileName = fileName;
        header.FileNameStar = fileName;

        var headerParameters = new HttpRouteValueDictionary(formValues);
        foreach (var parameter in headerParameters)
        {
            header.Parameters.Add(new NameValueHeaderValue(parameter.Key, parameter.Value.ToString()));
        }

        content.Headers.ContentDisposition = header;
        form.Add(content);
    }
}

With these extensions in place I can now update our API client to do the following:

foreach (var command in commands)
{
    formData.Add(
        new StreamContent(command.FileStream),
        command.FileName, command.FileName,
        new {
            CorrelationId = command.CorrelationId,
            PreserveFileName = command.PreserveFileName
        }
    );
}

This sets the content disposition header like so:

Content-Disposition: form-data;
    name=CS_touch_icon.png;
    filename=CS_touch_icon.png;
    filename*=utf-8''CS_touch_icon.png;
    CorrelationId=d4ddd5fb-dc14-4e93-9d87-babfaca42353;
    PreserveFileName=False

On the API side, we can loop through each file in the upload and read the additional data like so:

foreach (var file in FileData)
{
    var contentDisposition = file.Headers.ContentDisposition;
    var correlationId = GetNameHeaderValue(contentDisposition.Parameters, "CorrelationId");
}

Using the following helper method:

private static string GetNameHeaderValue(ICollection<NameValueHeaderValue> headerValues, string name)
{
    if (headerValues == null)
        return null;

    var nameValueHeader = headerValues.FirstOrDefault(x => x.Name.Equals(name, StringComparison.OrdinalIgnoreCase));
    return nameValueHeader != null ? nameValueHeader.Value : null;
}

In case you were interested, below is the updated code we are using to process the uploaded files within ASP.NET MVC:

[HttpPost]
public async Task<ActionResult> Settings(SiteSettingsModel model)
{
    await HandleFiles(new[] {
        Tuple.Create<HttpPostedFileBase, Action<string>>(model.LogoFile, uri => model.LogoUri = uri),
        Tuple.Create<HttpPostedFileBase, Action<string>>(model.IconFile, uri => model.IconUri = uri),
        Tuple.Create<HttpPostedFileBase, Action<string>>(model.FaviconFile, uri => model.FaviconUri = uri)
    });

    await siteClient.UpdateSiteSettings(Customer.CurrentSite, model);

    return RedirectToAction("settings")
        .AndAlert(AlertType.Success, "Success!", "Your site settings were updated successfully.");
}

private async Task HandleFiles(Tuple<HttpPostedFileBase, Action<string>>[] files)
{
    var uploadRequests = (from file in files
                          where file.Item1.IsValid() // ensures a valid file
                          let correlationId = Guid.NewGuid().ToString()
                          select new
                          {
                              CorrelationId = correlationId,
                              Command = file.Item1.ToUploadMediaCommand(correlationId),
                              OnFileUploaded = file.Item2
                          }).ToList();

    if (uploadRequests.Any())
    {
        var results = await mediaClient.UploadMedia(Customer.CurrentSite,
            uploadRequests.Select(u => u.Command).ToArray());

        foreach (var result in results)
        {
            // find the original request using the correlation id
            var request = uploadRequests.FirstOrDefault(r => r.CorrelationId == result.CorrelationId);

            if (request != null)
            {
                request.OnFileUploaded(result.Uri);
            }
        }
    }
}



SQL Server Hosting France - HostForLIFE.eu :: SQL String concatenation with CONCAT() function

March 10, 2014 08:05 by author Peter

We have been using the plus sign (+) operator for concatenating string values for years in SQL Server, with its limitations (or more precisely, its standard behaviors). The biggest disadvantage of this operator is that the result is NULL when concatenating with NULLs. This can be overcome by different techniques, but it needs to be handled. Have a look at the code below:

 -- FullName will be NULL for all
 -- records that have NULL for MiddleName
 SELECT  
   BusinessEntityID 
   , FirstName + ' ' + MiddleName + ' ' + LastName AS FullName 
 FROM Person.Person 
 -- One way of handling it 
 SELECT  
   BusinessEntityID 
   , FirstName + ' ' + ISNULL(MiddleName, '') + ' ' + LastName AS FullName 
 FROM Person.Person 
 -- Another way of handling it 
 SELECT  
   BusinessEntityID 
   , FirstName + ' ' + COALESCE(MiddleName, '') + ' ' + LastName AS FullName 
 FROM Person.Person 

SQL Server 2012 introduced a new function called CONCAT that accepts multiple string values, including NULLs. The difference between CONCAT and (+) is that CONCAT substitutes NULLs with an empty string, eliminating the need for additional NULL handling. Here is the code.

 SELECT  
   BusinessEntityID 
   , CONCAT(FirstName, ' ', MiddleName, ' ', LastName) AS FullName 
 FROM Person.Person 

If you were unaware of this function, consider using CONCAT for your next string concatenation for a better result. However, remember that when all arguments are NULL, CONCAT returns an empty string typed as varchar(1).
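A quick illustration of the difference, with the expected results shown as comments (assuming the default CONCAT_NULL_YIELDS_NULL setting):

```sql
SELECT 'A' + NULL        AS PlusResult,    -- NULL
       CONCAT('A', NULL) AS ConcatResult;  -- 'A'
```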



SQL Server 2012 Spain Hosting - HostForLIFE.eu :: Configure a SQL Server Alias for a Named Instance

March 5, 2014 05:09 by author Peter

There are plenty of tutorials out there that explain how to configure an MS SQL Server alias. However, since none of them worked for me, I wrote this post so I'll be able to look it up in the future. Here's what finally got it working for me.

My Use Case

In my development team at work, some of our local database instances have different names. Manually adapting the connection string to my current local development machine every single time is not an option for me because it's error-prone (changes might get checked into version control) and outright annoying.

The connection string we're using is defined in our Web.config like this:

<add name="SqlServer" connectionString="server=(local)\FooBarSqlServer;…" providerName="System.Data.SqlClient" />

This is the perfect use case for an alias. Basically, an alias maps an arbitrary server name to an actual database server. So I created an alias for FooBarSqlServer, which allows me to use the above (unchanged) connection string to connect to my local (differently named) SQL Server instance. That was when I ran into the trouble motivating me to write this post. The alias simply didn't work: I couldn't use it to connect to the database, neither in our application nor using SQL Server Management Studio.

The Working Solution

I googled around quite a bit and finally found the solution in Microsoft's How to connect to SQL Server by using an earlier version of SQL Server: The section Configure a server alias to use TCP/IP sockets pointed out that I had to look up the specific port number used by the TCP/IP protocol:

Here's how you find the port number that's being used by TCP/IP on your machine:

1) Open the SQL Server Configuration Manager.

2) Expand SQL Server Network Configuration and select Protocols for <INSTANCE_NAME>.

3) Double-click on TCP/IP and make sure Enabled is set to Yes.

4) Remember whether Listen All is set to Yes or No and switch to the IP Addresses tab.

- Now, if Listen All was set to Yes (which it was for me), scroll down to the IPAll section at the very bottom of the window and find the value that's displayed for TCP Dynamic Ports.

- If Listen All was set to No, locate the value of TCP Dynamic Ports for the specific IP address you're looking for.

You'll have to copy this port number into the Port No field when you're configuring your alias. Note that you'll have to set the Alias Name to the exact value used in your connection string. Also, if you're not using the default SQL Server instance on your development machine (which I am), you'll need to specify the instance name in the Server field in addition to the server name. In my case, that would be something like YourDirectory\NAMED_SQL_INSTANCE. Remember to also define the alias for 32-bit clients if your database has both 64-bit and 32-bit clients.



HostForLIFE.eu Proudly Announces ASP.NET MVC 5.1.1 Hosting

February 28, 2014 10:47 by author Peter

HostForLIFE.eu, a Recommended Windows and ASP.NET Spotlight Hosting Partner in Europe, has announced the availability of new hosting plans that are optimized for the latest update of the Microsoft ASP.NET MVC 5.1.1 technology. HostForLIFE.eu is a cheap, reliable hosting provider for advanced Windows and ASP.NET technology, with constant uptime, excellent customer service and quality. It proudly announces the availability of ASP.NET MVC 5.1.1 hosting in its entire server environment.

HostForLIFE.eu hosts its servers in top-class data centers located in Amsterdam to guarantee 99.9% network uptime. All data centers feature redundancies in network connectivity, power, HVAC, security, and fire suppression. All hosting plans from HostForLIFE.eu include 24×7 support and a 30-day money-back guarantee. You can start hosting your ASP.NET MVC 5.1.1 site in their environment from as low as €3.00/month.

ASP.NET MVC 5.1.1 is the latest update to Microsoft's popular MVC (Model-View-Controller) technology, an established web application framework. MVC enables developers to build dynamic, data-driven web sites. ASP.NET MVC 5.1.1 adds sophisticated features and gives you a powerful, patterns-based way to build dynamic websites that enables a clean separation of concerns and gives you full control over markup. For additional information about ASP.NET MVC 5.1.1 hosting offered by HostForLIFE.eu, please visit http://hostforlife.eu/European-ASPNET-MVC-511-Hosting

About HostForLIFE.eu

HostForLIFE.eu is a European Windows hosting provider which focuses on the Windows platform only. HostForLIFE.eu delivers on-demand hosting solutions including shared hosting, reseller hosting, cloud hosting, dedicated servers, and IT as a service for companies of all sizes.



Windows Server 2012 R2 Hosting - HostForLIFE.eu :: Learning Windows Server 2012 R2 on Remote Desktop Service

February 21, 2014 09:41 by author Peter

Microsoft just released Windows Server 2012 R2 (18 October 2013). In this article, I'm going to describe which changes are available in this release, compared to the first Windows Server 2012 release, for Remote Desktop Services (RDS)/Virtual Desktop Infrastructure (VDI). This topic contains only brief information about Windows Server 2012 R2 hosting; if you want to become more familiar with Windows Server 2012 R2, you should try HostForLife.eu.

Installation

Logically we need to start with the installation of the RDS components. Compared with the first Windows Server 2012 release, actually nothing has changed here. The possibility to install RDS using the specific Remote Desktop Services installation is still available, followed by the same two installation methodologies: a standard deployment to divide the different RDS roles over several servers, and the Quick Start deployment where the roles are installed on one server. Using the RDS installation method, the question whether to deploy a virtual machine-based desktop deployment or a session-based desktop deployment is logically also still asked. Just as in Windows Server 2012, this installs either a VDI infrastructure (the Remote Desktop Virtualization Host) or the infrastructure formerly known as Terminal Server.

It is also still possible to use the role-based installation; here too, nothing has changed compared to Windows Server 2012.

What is actually new is the possibility to install the RD Connection Broker role on a Domain Controller. Personally, I would only use this for demo environments or really small infrastructures. There is also the possibility of an in-place upgrade of an RD Session Host to 2012 R2 (from the current 2012 version).

Configuration

Also on this part I can be really quick: there are no noticeable changes made to the configuration part. It is fully comparable with the configuration possibilities in Windows Server 2012; I don't see anything new, and nothing has been removed either. There are also no changes visible to the User Profile Disk feature.

 

Making changes to RD Web Access is also still done the same way (by editing the web.config file).

Is there something new?

You could wonder whether there is anything new at all within this R2 release for Remote Desktop Services. There are definitely changes/new features, but most of them are focused on the user experience. Many improvements are difficult to show in an article-based review. I will describe the new features, my thoughts about those new features and, where possible, show the features.

Improved RemoteApp Behaviour

Although this looks like a small step for the user experience, in my opinion it is a big step. The improved behavior shows up in two ways. The first is that when you are dragging a RemoteApp to another location on the desktop, the complete application is still shown, while previous versions only showed the frame of the application. Secondly, a full preview thumbnail is shown when hovering over the app in the taskbar, just as for applications available directly on the client. In previous versions only a standard icon of the application was shown. There is now also full support for transparent windows for RemoteApps and Aero Peek, again creating the same view as local applications.

Quick Reconnect

When using Remote and Desktop Connections (RADC) in Windows Server 2012, it could take a while to reconnect to all RemoteApps. Microsoft claims that this process has been improved and now takes around 5 seconds to fully reconnect. Network loss detection has also been improved, with a more intuitive reconnect phase. Logically, these are good improvements for a better user experience.

Codec Improvements

The codecs have also been improved to further reduce bandwidth for non-video content, including the possibility to offload all progressive decode processing to AVC/H.264 hardware available in the client. DirectX 11 is now supported as well, extending the ability to virtualize a GPU on a Hyper-V host.

RemoteApp within a Remote Desktop Session

Although this was already available in Windows 2012 itself, it is not widely known, so I thought it was a good idea to mention it once more. It is now possible/supported to start a Remote Desktop session and, within this session, start a RemoteApp (from another server). This option creates the possibility to use the silo concept with Microsoft RDS, separating a specific (set of) application(s) on specific servers for performance, compatibility and/or security reasons.
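To illustrate the silo concept: the RemoteApp started from within the desktop session is ultimately just an .rdp file pointing at the other session host. A minimal sketch, where the server name is hypothetical and the application alias must match what is actually published on that host:

```
full address:s:apps-silo-01.contoso.com
remoteapplicationmode:i:1
remoteapplicationprogram:s:||wordpad
remoteapplicationname:s:WordPad
alternate shell:s:rdpinit.exe
```

Double-clicking such a file inside the full desktop session launches only the siloed application from the second host.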

Dynamically monitor/resolution changes

Within Windows 2012 R2 it is now possible to change the number of monitors during an RD session, or the orientation of the screen (for example on a Surface), and the session will automatically resize according to the change without a reconnect. I think this is a big step forward and end users will love it, especially with tablets in mind, where the user often switches from horizontal to vertical view and vice versa.

Support for ClickOnce Applications

Although only a few suppliers use this installation technique, and it has some challenges within managed user-profile environments, it is a good thing that ClickOnce applications can be offered as a RemoteApp within Windows Server 2012 R2. The improvements above are mostly based on the user experience. There are also some changes made from a technical or administrative viewpoint.

Online Data Deduplication

Probably the biggest RDS announcement made for Windows Server 2012 R2. Online data deduplication can achieve a big reduction in storage capacity requirements (Microsoft claims up to 90%) for personal VDI machines. As personal VDI machines appear to be the most used deployment technique for VDI infrastructures, Microsoft has made one of the biggest challenges of VDI a much smaller one.
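A sketch of what enabling the feature looks like for a VDI storage volume, based on the Data Deduplication cmdlets; the volume letter is hypothetical and the commands should be run in an elevated PowerShell session on the file server holding the VDI disks:

```powershell
# Sketch: enabling Data Deduplication for a VDI storage volume on Windows Server 2012 R2.
# The volume letter is hypothetical.
Add-WindowsFeature FS-Data-Deduplication

# UsageType HyperV is the option aimed at (running) VDI workloads in 2012 R2
Enable-DedupVolume -Volume "E:" -UsageType HyperV

# Check savings after the optimization jobs have run
Get-DedupStatus -Volume "E:"
```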

Shadowing is back

In Windows 2012 Microsoft removed the RD Shadowing option completely, while in Windows 2008 (R2) the functionality had already been reduced. Although I see most customers now moving to Remote Assistance, Microsoft is re-introducing the shadowing functionality. Microsoft states that it supports both single and multiple monitors, but it is unclear to me whether the situation is the same as in Windows 2008 R2, where both the user and the support engineer needed the same number of monitors. Unfortunately I could not test this in my demo environment; if anyone has experience with it and would like to share, let me know (so I can add it to the article).
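Besides the GUI option in Server Manager, shadowing can be started from the command line. A sketch, with hypothetical server names; the session ID has to be looked up first:

```powershell
# Sketch: shadowing a user session in Windows Server 2012 R2 (server names are hypothetical).
# First find the user's session ID via the Connection Broker:
Get-RDUserSession -ConnectionBroker "broker.contoso.com" |
    Select-Object HostServer, UserName, UnifiedSessionId

# Then connect from the helpdesk machine; /control requests keyboard/mouse control,
# /noConsentPrompt skips the user's approval dialog (if policy allows it)
mstsc /v:rdsh-01.contoso.com /shadow:2 /control /noConsentPrompt
```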

Conclusion: To upgrade or not to upgrade

Windows Server 2012 R2 Remote Desktop Services brings, in my opinion, several new features and improvements that are real added value. But should you upgrade to this R2 release? It depends on your current situation. If you are still on Windows 2008 R2 or lower, I think this is a good moment to consider upgrading to the latest platform. If you are still using a 32-bit edition, the 64-bit challenges still apply and should not be forgotten. If you are running Windows Server 2012, it depends on current end-user satisfaction: if users do not complain about the parts that are now improved, I don't think an upgrade is necessary. It is a different situation if you are using personal VDI machines and can use the online data deduplication feature; then I think upgrading to Windows Server 2012 R2 is almost a must. Summarized: R2 offers several good new features and improvements.



About HostForLIFE.eu

HostForLIFE.eu is a European Windows hosting provider which focuses on the Windows platform only. We deliver on-demand hosting solutions including shared hosting, reseller hosting, cloud hosting, dedicated servers, and IT as a service for companies of all sizes.

We offer the latest Windows 2016 Hosting, ASP.NET Core 2.2.1 Hosting, ASP.NET MVC 6 Hosting and SQL 2017 Hosting.

