European Windows 2012 Hosting BLOG

BLOG about Windows 2012 Hosting and SQL 2012 Hosting - Dedicated to European Windows Hosting Customers

European SQL 2012 Hosting - Amsterdam :: Why a Session With sp_readrequest Takes So Long to Execute

February 12, 2013 05:36 by author Scott

While running a Long Running Sessions Detection Job on a production server, we started receiving alerts that a session had been executing for more than 3 minutes. But what was this session actually doing? Here is the alert report.

SP ID | Stored Procedure Call     | DB Name | Executing Since
------|---------------------------|---------|----------------
58    | msdb.dbo.sp_readrequest;1 | msdb    | 3 min

sp_readrequest is a system stored procedure, which basically reads a message request from the queue and returns its contents.

This process can remain active for the time configured in the DatabaseMailExeMinimumLifeTime parameter, which is set when the Database Mail profile is configured. 600 seconds is the default value for this external mail process. According to BOL, DatabaseMailExeMinimumLifeTime is the minimum amount of time, in seconds, that the external mail process remains active.

This can be changed at the time of mail profile configuration, or you can simply run an UPDATE query to change it.

UPDATE msdb.dbo.sysmail_configuration
SET
paramvalue = 60 --60 Seconds
WHERE paramname = 'DatabaseMailExeMinimumLifeTime'


We have changed this to 60 seconds to resolve our problem.
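To verify the value before or after the change, you can query the same configuration table:

SELECT paramname, paramvalue
FROM msdb.dbo.sysmail_configuration
WHERE paramname = 'DatabaseMailExeMinimumLifeTime'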

 



European ASP.NET Ajax Hosting :: jQuery Ajax Does Not Work in IE, but Works in Firefox, Chrome, Safari. Why?

February 8, 2013 08:03 by author Scott

You may have come across the situation where you are making a jQuery AJAX GET request to update part of your page with changing content, but it never updates or just returns blank content.

What it actually comes down to is that the ‘Get’ call is working correctly, but Internet Explorer (IE) caches the response, so you never get the updated results from the server.

What are the options?

1. Use POST rather than GET – although this is not semantically correct (you should only use POST when modifying content on the server), technically there is not much difference, and IE does not cache POST responses.

2. (Preferred) You can disable caching globally using $.ajaxSetup(), for example:

$.ajaxSetup({ cache: false });


This appends a timestamp to the querystring when making the request. To turn cache off for a particular $.ajax() call, set cache: false on it locally, like this:

$.ajax({
    cache: false,
    // other options...
});
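Putting it together, a typical call that refreshes part of the page might look like the sketch below (the URL and element ID are placeholders, not from the original post):

$.ajax({
    url: '/latest-content',          // placeholder URL returning the updated fragment
    type: 'GET',
    cache: false,                    // appends a timestamp so IE cannot serve a cached response
    success: function (data) {
        $('#content').html(data);    // update the target element with the fresh content
    }
});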



SmarterMail Issue :: SmarterMail Attachments Not Working

February 6, 2013 07:59 by author Scott

Sometimes, SmarterMail permissions can become corrupted, which prevents attachments from being displayed in webmail. This article goes over how to fix this.

Steps
An error like this can sometimes be seen when trying to get the attachment:
XML Parsing Error: no element found

Before you proceed, run the SmarterMail self-diagnostic (also known as the checkup page); this will help you identify the problem.

  1. Log in to SmarterMail as the system administrator.
  2. Click Settings (top left).
  3. Under Activation click SmarterMail Self Diagnostic.

This will launch a separate window showing you the problem.

For permissions:

  • C:\Program Files (x86)\SmarterTools\SmarterMail\MRS\MailProcessing
  • C:\Program Files (x86)\SmarterTools\SmarterMail\MRS\App_Data

Please apply read and write permissions for your user, NETWORK SERVICE, and IUSR to the folders above, then run the Self Diagnostic test again.
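If you prefer to script this, the same permissions can usually be granted from an elevated command prompt with icacls; this is only a sketch, so adjust the paths and account names to match your installation:

icacls "C:\Program Files (x86)\SmarterTools\SmarterMail\MRS\MailProcessing" /grant "NETWORK SERVICE:(OI)(CI)M" /grant "IUSR:(OI)(CI)M"
icacls "C:\Program Files (x86)\SmarterTools\SmarterMail\MRS\App_Data" /grant "NETWORK SERVICE:(OI)(CI)M" /grant "IUSR:(OI)(CI)M"

Here (OI)(CI)M grants Modify rights that are inherited by subfolders and files.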

 



European SQL 2012 Hosting - Amsterdam :: How to Transfer Data Between Two SQL Server Databases

January 30, 2013 07:42 by author Scott

Sometimes we need to transfer database data and objects from one server to another. In this article I present a tool which can transfer data and database objects, such as tables and stored procedure scripts, to another SQL Server database. In practice I have transferred database objects from SQL Server 2005 to SQL Server 2012. I thought it could help others who face these kinds of problems!

Background 

Once I had a requirement to transfer data between two online databases. I saw many tools on the internet for transferring data between two SQL Server databases, but I decided to develop this kind of tool myself because I believe that if you write some code, you learn something new...

In this article we learn the following points: 

- How to connect to a SQL Server database. 
- How to generate a Table and SP script programmatically. 
- How to copy data between two tables using Bulk Copy.
- How to insert data in an Identity column manually.  

Using the code 

The name of this project is DataTransfer. I used a Windows Forms Application in Visual Studio 2008 to develop this project. The UI is divided into three sections. See the image below, which points out the five main functionalities:



1. Point-1 in section-1: this section is used to collect the source server information. The source server is the SQL Server database that holds the data and objects to be transferred.

2. Point-2 in section-2: this section is used to collect the destination server information. The destination server is the SQL Server database where the transferred objects and data will be placed.

3. Point-3 in sections 1 and 2: used to set the connection authentication, since we connect to a database using either SQL Server Authentication or Windows Authentication.

Before transferring an object or any data, we first build the source and destination connections based on the settings above.

Connection string build code:

public void BuildConnectionString()
{
    if (chkSAuthentication.Checked)
    {
        strSrc = "Data Source=" + txtSServerName.Text + ";Initial Catalog=" +
          txtSDatabase.Text + ";User Id=" + txtSLogin.Text +
          ";Password=" + txtSPassword.Text;
    }
    else
    {
        strSrc = "Data Source=" + txtSServerName.Text +
          ";Initial Catalog=" + txtSDatabase.Text + ";Integrated Security=True";
    }

    if (chkDAuthentication.Checked)
    {
        strDst = "Data Source=" + txtDServerName.Text + ";Initial Catalog=" +
          txtDDatabase.Text + ";User Id=" + txtDLogin.Text +
          ";Password=" + txtDPassword.Text;
    }
    else
    {
        strDst = "Data Source=" + txtDServerName.Text +
          ";Initial Catalog=" + txtDDatabase.Text +
          ";Integrated Security=True";
    }
}
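As an aside (not part of the original tool), concatenating text box values directly into a connection string breaks if a value contains a character such as ';'. A safer sketch of the same idea uses SqlConnectionStringBuilder; the helper and the usage comment below are hypothetical and simply mirror the form fields above:

using System.Data.SqlClient;

private static string BuildSafeConnectionString(string server, string database,
    bool useSqlAuthentication, string login, string password)
{
    var builder = new SqlConnectionStringBuilder
    {
        DataSource = server,        // server name or address
        InitialCatalog = database   // database name
    };

    if (useSqlAuthentication)
    {
        builder.UserID = login;     // SQL Server Authentication
        builder.Password = password;
    }
    else
    {
        builder.IntegratedSecurity = true;  // Windows Authentication
    }

    return builder.ConnectionString;
}

// Usage, mirroring BuildConnectionString() above:
// strSrc = BuildSafeConnectionString(txtSServerName.Text, txtSDatabase.Text,
//              chkSAuthentication.Checked, txtSLogin.Text, txtSPassword.Text);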


1. Point-4 in section-3: used to choose the transfer behavior. There are two main options: one for a Table object and another for a Stored Procedure object. When we select Table, the text box expects the name of the table to transfer; when we select SP, it expects the stored procedure name.

2. Point-5 in section-3: used mainly when transferring a Table object from one database to another. When we transfer a table there are two options: the table script and the table data. This application generates a table script from the source database and executes it to create the table in the destination database.

To create a table script we use a T-SQL batch. When we execute this T-SQL statement it returns a DataTable whose rows together form the complete table script, including the IDENTITY definition and the primary key.

Table script generation code:

public string GetTableScript(string TableName, string ConnectionString)
{
    string Script = "";

    string Sql = "declare @table varchar(100)" + Environment.NewLine +
    "set @table = '" + TableName + "' " + Environment.NewLine +
        //"-- set table name here" +
    "declare @sql table(s varchar(1000), id int identity)" + Environment.NewLine +
    " " + Environment.NewLine +
        //"-- create statement" +
    "insert into  @sql(s) values ('create table [' + @table + '] (')" + Environment.NewLine +
    " " + Environment.NewLine +
        //"-- column list" +
    "insert into @sql(s)" + Environment.NewLine +
    "select " + Environment.NewLine +
    "    '  ['+column_name+'] ' + " + Environment.NewLine +
    "    data_type + coalesce('('+cast(character_maximum_length as varchar)+')','') +
    ' ' +" + Environment.NewLine +
    "    case when exists ( " + Environment.NewLine +
    "        select id from syscolumns" + Environment.NewLine +
    "        where object_name(id)=@table" + Environment.NewLine +
    "        and name=column_name" + Environment.NewLine +
    "        and columnproperty(id,name,'IsIdentity') = 1 " + Environment.NewLine +
    "    ) then" + Environment.NewLine +
    "        'IDENTITY(' + " + Environment.NewLine +
    "        cast(ident_seed(@table) as varchar) + ',' + " + Environment.NewLine +
    "        cast(ident_incr(@table) as varchar) + ')'" + Environment.NewLine +
    "    else ''" + Environment.NewLine +
    "   end + ' ' +" + Environment.NewLine +
    "    ( case when IS_NULLABLE = 'No' then 'NOT ' else '' end ) + 'NULL ' + " + Environment.NewLine +
    "    coalesce('DEFAULT '+COLUMN_DEFAULT,'') + ','" + Environment.NewLine +
    " " + Environment.NewLine +
    " from information_schema.columns where table_name = @table" + Environment.NewLine +
    " order by ordinal_position" + Environment.NewLine +
    " " + Environment.NewLine +
        //"-- primary key" +
    "declare @pkname varchar(100)" + Environment.NewLine +
    "select @pkname = constraint_name from information_schema.table_constraints" + Environment.NewLine +
    "where table_name = @table and constraint_type='PRIMARY KEY'" + Environment.NewLine +
    " " + Environment.NewLine +
    "if ( @pkname is not null ) begin" + Environment.NewLine +
    "    insert into @sql(s) values('  PRIMARY KEY (')" + Environment.NewLine +
    "    insert into @sql(s)" + Environment.NewLine +
    "        select '   ['+COLUMN_NAME+'],' from information_schema.key_column_usage" +
Environment.NewLine +
    "        where constraint_name = @pkname" + Environment.NewLine +
    "        order by ordinal_position" + Environment.NewLine +
        //"    -- remove trailing comma" +
    "    update @sql set s=left(s,len(s)-1) where id=@@identity" + Environment.NewLine +
    "    insert into @sql(s) values ('  )')" + Environment.NewLine +
    "end" + Environment.NewLine +
    "else begin" + Environment.NewLine +
        //"    -- remove trailing comma" +
    "    update @sql set s=left(s,len(s)-1) where id=@@identity" + Environment.NewLine +
    "end" + Environment.NewLine +
    " " + Environment.NewLine +
    "-- closing bracket" + Environment.NewLine +
    "insert into @sql(s) values( ')' )" + Environment.NewLine +
    " " + Environment.NewLine +
        //"-- result!" +
    "select s from @sql order by id";
    DataTable dt = GetTableData(Sql, ConnectionString);
    foreach (DataRow row in dt.Rows)
    {
        Script += row[0].ToString() + Environment.NewLine;
    }

    return Script;
}

To create a stored procedure script we use the built-in system procedure sp_helptext, which takes the stored procedure name as a parameter.

SP script generate code:

public string GetSPScript(string SPName, string ConnectionString)
{
    string Script = "";

    string Sql = "sp_helptext '" + SPName + "'";

    DataTable dt = GetTableData(Sql, ConnectionString);
    foreach (DataRow row in dt.Rows)
    {
        Script += row[0].ToString() + Environment.NewLine;
    }

    return Script;
}

Once we have both scripts from the source database, we simply execute them against the destination database to transfer the objects.
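The code above calls Utility.GetTableData and, later, Utility.ExecuteQuery, but the article does not show them. A minimal sketch of what such helpers might look like (the names and signatures are assumed from the calls in this article):

using System.Data;
using System.Data.SqlClient;

public class Utility
{
    // Runs a query against the given connection string and returns the results as a DataTable.
    public DataTable GetTableData(string sql, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter(sql, connection))
        {
            var table = new DataTable();
            adapter.Fill(table);
            return table;
        }
    }

    // Executes a non-query batch, e.g. the generated CREATE TABLE or INSERT statements.
    public void ExecuteQuery(string sql, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}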

Now we transfer the data between the two servers. In this project we use two options to transfer data: the bulk copy method, or generating INSERT statements from the source table and data and then executing those statements on the destination server.

Bulk data copy code:

void TransferData()
{
    try
    {
        DataTable dataTable = new Utility().GetTableData("Select * From " + txtTableName.Text, strSrc);

        SqlBulkCopy bulkCopy = new SqlBulkCopy(strDst, SqlBulkCopyOptions.TableLock)
        {
            DestinationTableName = txtTableName.Text,
            BatchSize = 100000,
            BulkCopyTimeout = 360
        };
        bulkCopy.WriteToServer(dataTable);

        MessageBox.Show("Data Transfer Succesfull.");
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}

Copy data using Insert statements code:  

void TransferDataWithTableScript()
{
    try
    {       

        DataTable dataTable = new Utility().GetTableData("Select * From " + txtTableName.Text, strSrc);

        if (!string.IsNullOrEmpty(new Utility().GetIdentityColumn(txtTableName.Text, strSrc)))
        {
            string InsertSQL = "";
            InsertSQL += "SET IDENTITY_INSERT [" + txtTableName.Text + "] ON " + Environment.NewLine;

            string ColumnSQL = "";
            foreach (DataColumn column in dataTable.Columns)
            {
                ColumnSQL += column.ColumnName + ",";
            }
            ColumnSQL = ColumnSQL.Substring(0, ColumnSQL.Length - 1);

            foreach (DataRow row in dataTable.Rows)
            {
                string ColumnValueL = "";
                foreach (DataColumn column in dataTable.Columns)
                {
                    ColumnValueL += "'" + row[column.ColumnName].ToString().Replace("''", "'") + "',";
                }
                ColumnValueL = ColumnValueL.Substring(0, ColumnValueL.Length - 1);

                InsertSQL += "Insert Into " + txtTableName.Text +
                  " (" + ColumnSQL + ") Values(" +
                  ColumnValueL + ")" + Environment.NewLine;
            }

            InsertSQL += "SET IDENTITY_INSERT [" + txtTableName.Text + "] OFF " + Environment.NewLine;

            new Utility().ExecuteQuery(InsertSQL, strDst);
        }       

        MessageBox.Show("Data Transfer Succesfull.");
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}

We need the second copy option because, when a table has an identity column, a plain copy would let the destination table generate new IDs instead of keeping the existing identity values. For this situation we wrap the generated INSERT statements in two extra statements: one executed before the inserts to allow explicit identity values, and one executed afterwards to turn that behavior off again.

Identity column code:  

SET IDENTITY_INSERT [" + txtTableName.Text + "] ON // before execute insert statement
SET IDENTITY_INSERT [" + txtTableName.Text + "] OFF // after execute insert statement

 



European SQL 2012 Hosting - Amsterdam :: Tabular Models vs PowerPivot Models SQL 2012

January 23, 2013 06:32 by author Scott

In SQL Server 2012, there is a new data model, called tabular, that is part of the new feature called the Business Intelligence Semantic Model (BISM). BISM also includes the multidimensional model (formerly called the UDM).

The Tabular model is based on concepts like tables and relationships that are familiar to anyone who has a relational database background, making it easier to use than the multidimensional model.  The tabular model is a server mode you choose when installing Analysis Services.

The tabular model is an enhancement of the current PowerPivot data model experience, both of which use the VertiPaq engine. When opening a PowerPivot for SharePoint workbook, an SSAS cube is created behind the scenes, which is why PowerPivot for SharePoint requires SSAS to be installed.

So if tabular models and PowerPivot models use the same Analysis Services engine, why are tabular models necessary when we already have PowerPivot?

There are four things that tabular models offer that PowerPivot models do not:

  1. Scalability – PowerPivot has a 2 GB limit on the size of the Excel file and does not support partitions, but tabular models have no such limit and do support partitions. Tabular models also support DirectQuery.
  2. Manageability – There are a lot of tools you can use with the tabular model that you can’t use with PowerPivot: SSMS, AMO, AMOMD, XMLA, the Deployment Wizard, AMO for PowerShell, and Integration Services.
  3. Securability – Tabular models can use row security and dynamic security, neither of which PowerPivot supports; PowerPivot offers only Excel workbook file security.
  4. Professional development toolchain – Tabular models live in the Visual Studio shell. Thus, they enjoy all the shell services such as integrated source control, msbuild integration, and Team Build integration. PowerPivot lives in the Excel environment, so it is limited to the extensibility model provided in Excel (which doesn’t include source control or build configuration). Also, because tabular models live in the VS environment, build and deployment can be naturally separated.


So Analysis Services can now be installed in one of three server modes: Multidimensional and Data Mining (default), PowerPivot for SharePoint, and Tabular.

More info:

When to choose tabular models over PowerPivot models

Comparing Analysis Services and PowerPivot

Feature by Server Mode or Solution Type (SSAS)

 



European SQL 2012 Hosting - Amsterdam :: SQL Server 2012 Integration Services with HostForLIFE.eu

December 24, 2012 05:51 by author Scott

SQL Server 2012 Integration Services introduces an innovative approach to deploying SSIS projects, known as the Project Deployment Model. This is the default and recommended deployment technique, due to a number of benefits it delivers (such as the ability to centralize management of package properties across deployed projects, as well as monitor and log package execution performance and progress). However, even though the new model has become the recommended and default configuration for new packages created using SQL Server Data Tools, the traditional, package-based methodology remains available and supported. More importantly, in some scenarios it might be considered a more viable option, since it allows for separation of the SQL Server Database Engine and SQL Server Integration Services roles, yielding performance benefits and facilitating distributed extraction, transformation, and loading (ETL) activities. The Project Deployment Model, on the other hand, requires SSIS to be collocated on a SQL Server 2012 instance, due to its dependency on the SSIS catalog. We will discuss the characteristics of the traditional package deployment approach and walk through its implementation in this post.


The deployment method available from a SQL Server Data Tools-based project is directly dependent on the model employed during its creation (it is also possible to switch between two models after a project is created by applying a conversion process, which is invoked from the Project menu in SQL Server Data Tools). Effectively, in order to use the traditional package deployment mechanism, you will need to first ensure that your project is based on the package deployment model, which you can easily identify by checking its label in the Solution Explorer window.

In addition, you will also have to modify default project properties. To access them, right-click on the node representing the project in the Solution Explorer window and select the Properties entry from its context sensitive menu. In the resulting dialog box, switch to the Deployment section, where you will find three settings displayed in the grid on the right hand side:

AllowConfigurationChanges – a Boolean value (i.e. True, which is the default, or False) that determines whether it will be possible to choose package configurations during its deployment and assign values of properties or variables they reference.

CreateDeploymentUtility – a Boolean value (True or False, which is the default) that indicates whether initiating the project build will result in the creation of its Deployment Utility.

DeploymentOutputPath – a path that points to the file system folder where the Deployment Utility will be created. The path is relative to the location where project files reside (and set, by default, to bin\Deployment).

Set the value of the CreateDeploymentUtility property to True and modify AllowConfigurationChanges according to your requirements (e.g. set it to False if your project does not contain any package configurations or you want to prevent their modification during deployment). Next, start the build process (using the Build item in the top level main menu of SQL Server Data Tools), which will populate the output folder (designated by the DeploymentOutputPath parameter) with the .dtsx package file (or files, depending on the number of packages in your project), an XML-formatted Deployment Manifest file (whose name is constructed by concatenating the project name and the .SSISDeploymentManifest suffix) and, potentially, a number of other project-related files (representing, for example, custom components or package configurations).

The subsequent step in the deployment process involves copying the entire content of the Deployment Output folder to a target server and double-clicking on the Deployment Manifest file at its destination. This action will automatically launch the Package Installation Wizard. After its first, informational page, you will be prompted to choose between File system and SQL Server deployments.

The first of these options creates an XML-formatted file (with extension .dtsx) in an arbitrarily chosen folder. If the project contains configuration files (and the AllowConfigurationChanges project property was set to True when you generated the build), then you will be given an option to modify the values of properties included in their content. At the end of this procedure, the corresponding .dtsConfig files will be added to the target folder.

The second option, labeled SQL Server deployment in the Package Installation Wizard, relies on a SQL Server instance as the package store. Once you select it, you will be prompted for the name of the server hosting the SQL Server Database Engine, an appropriate authentication method, and a path where the package should be stored. If you want to organize your packages into a custom folder hierarchy, you will need to pre-create it by connecting to the Integration Services component using SQL Server Management Studio. In case your package contains additional files (such as package configurations), you will also be given an opportunity to designate their location (by default, the wizard points to the Program Files\Microsoft SQL Server\110\DTS\Packages folder).

In either case, you can decide whether you want to validate packages following their installation (which will append the Packages Validation page to the Package Installation Wizard, allowing you to identify any issues encountered during deployment). In addition, when using SQL Server deployment, you have an option to set a package protection level (resulting in assignment of ServerStorage value to the ProtectionLevel package property). When deploying to file system, this capability is not available, forcing you to resort (depending on the deployment target) to either NTFS permissions or SQL Server database roles for securing access to your packages and sensitive information they might contain.

Just as in earlier versions of SQL Server, the local SQL Server Integration Services 11.0 instance (implemented as the MSDTSServer110 service) offers the ability to manage packages stored in the MSDB database and the file system, providing you with additional benefits (such as monitoring functionality), which we will discuss in more detail in our upcoming articles. In the case of MSDB storage, this is accomplished by following the SQL Server deployment process we just described and is reflected by entries appearing under the MSDB subfolder of the Stored Packages folder of the Integration Services node when viewed in SQL Server 2012 Management Studio. In the same Stored Packages folder, you will also find the File System subfolder containing file system-based packages that have been identified by the SSIS service in the local file system. By default, the service automatically enumerates packages located in the Program Files\Microsoft SQL Server\110\DTS\Packages directory; however, it is possible to alter this location by editing the Program Files\Microsoft SQL Server\110\DTS\Binn\MsDtsSrvr.ini.xml file and modifying the content of its StoragePath XML element. Incidentally, the same file controls other characteristics of the MSDTSServer110 service, such as package execution behavior in scenarios where the service fails or stops (by default, execution is halted) and the location of target SSIS instances (defined using the ServerName XML element).

While the Package Installation Wizard is straightforward to use, it is not well suited for deploying multiple packages. This shortcoming can be remedied by taking advantage of the DTUtil.exe command line utility, which in addition to its versatility (including support for package deployment and encryption), can also be conveniently incorporated into batch files. Its syntax takes the form DTUtil /option [value] [/option [value]] …, pairing option names with the values associated with them. For example, the /SQL, /FILE, and /DTS options designate the package storage type (the MSDB database, the file system, and the SSIS Package Store, respectively) and are combined with a value that specifies the package location (as a relative or absolute path). By including the COPY, MOVE, DELETE, or EXISTS options (with a value identifying the package location) you can copy, move, delete, or verify the existence of packages across all package stores.
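For illustration only (the package name and path below are hypothetical), a couple of typical invocations look like this:

rem Copy a package from the file system into the msdb database on the local server
dtutil /FILE "C:\Packages\LoadSales.dtsx" /COPY SQL;LoadSales

rem Verify that the package now exists in msdb
dtutil /SQL LoadSales /EXISTS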

In conclusion, the traditional package deployment methods available in earlier versions of SQL Server Integration Services are still supported and fully functional in the current release. However, in most scenarios, you should consider migrating your environment to the latest Project Deployment Model due to a wide range of advantages it delivers.

 



European Cloud Hosting Service with HostForLIFE.eu

December 20, 2012 10:03 by author Scott

As cloud technology has matured, what was once an IT paradigm shift has triggered a business paradigm shift. The flexibility, immediacy, scalability and affordability of virtualization technologies have become not just an IT advantage, but a real competitive advantage for businesses. What are needed now are cloud solutions that are built to be business-friendly.

While the cloud can’t do everything – at least not yet – there are plenty of things that HostForLife.eu Cloud Hosting Solutions can do. And they do them very well for all kinds of business needs. Cloud Servers and Shared Cloud Hosting are more than just rehashes of antiquated virtualization techniques and models. These are the cloud solutions that businesses and their IT administrators have been waiting for.

Whether you are running a mission-critical business application, storing mass amounts of data or delivering an online gaming solution to millions of users, you expect the delivery of service to be seamless. Our cloud services provide you the flawless execution you demand from your hosted infrastructure.

Reduce Your Risk through our Availability and Recovery services. We create Cloud Hosting solutions using certified and tested configurations that are optimized for high availability. We insert recovery into every solution we deploy, providing complete IT resilience. Our recovery solutions protect your data assets.

Reduce Your Cost by leveraging the economies of our people, processes and systems. By sharing our infrastructure, expertise and innovations, we help our customers drastically reduce the costs of building an off-site DR solution or highly-available solution in house.

Accelerate Your Delivery by getting instant access to our production-ready, enterprise-class infrastructure. Our industry-leading Customer Portal is an information-rich dashboard that lets you maintain control of your Cloud environment. Our automated provisioning system allows us to deploy scalable solutions that are highly secure. We help you migrate to the Cloud, adhere to BC/DR requirements, and achieve Cloud optimization faster and more efficiently.

About HostForLife.eu

HostForLIFE was established to cater to an underserved market in the hosting industry; web hosting for customers who want excellent service. This is why HostForLIFE continues to prosper throughout the web hosting industry’s maturation process.

HostForLife.eu is Microsoft No #1 Recommended Windows and ASP.NET Hosting in European Continent. Our service is ranked the highest top #1 spot in several European countries, such as: Germany, Italy, Netherlands, France, Belgium, United Kingdom, Sweden, Finland, Switzerland and many top European countries. Click here for more information

As a leading small to mid-sized business web hosting provider, we strive to offer the most technologically advanced hosting solutions available to our customers across the world. Security, reliability, and performance are at the core of our hosting operations to ensure each site and/or application hosted on our servers is highly secured and performs at optimum level. Unlike other web hosting companies, we do not overload our servers.

We believe in our customers — customers are at the core of what we do! And as such, all of our R&D efforts are done with voice of the customer in mind; our support team is courteous and friendly and will do whatever it takes to assist and resolve your support tickets



European World-Class Windows and ASP.NET Web Hosting Leader - HostForLIFE.eu

December 18, 2012 07:06 by author Scott

Fantastic and Excellent Support has made HostForLife.eu the Windows and ASP.NET Hosting service leader in European region. HostForLife.eu delivers enterprise-level hosting services to businesses of all sizes and kinds in European region and around the world. HostForLife.eu started its business in 2006 and since then, they have grown to serve more than 10,000 customers in European region. HostForLife.eu integrates the industry's best technologies for each customer's specific need and delivers it as a service via the company's commitment to excellent support. HostForLife.eu core products include Shared Hosting, Reseller Hosting, Cloud Computing Service, SharePoint Hosting and Dedicated Server hosting.

HostForLife.eu service is No #1 Top Recommended Windows and ASP.NET Hosting Service in European continent. Their services is ranked the highest top #1 spot in several European countries, such as: Germany, Italy, Netherlands, France, Belgium, United Kingdom, Sweden, Finland, Switzerland and many top European countries. For more information, please refer to http://www.microsoft.com/web/hosting/HostingProvider/Details/953.

HostForLife.eu has a very strong commitment to introduce their Windows and ASP.NET hosting service to the worldwide market. HostForLife.eu starts to target market in United States, Middle East and Asia/Australia in 2010 and by the end of 2013, HostForLife.eu will be the one-stop Windows and ASP.NET Hosting Solution for every ASP.NET enthusiast and developer.

HostForLife.eu leverages the best-in-class connectivity and technology to innovate industry-leading, fully automated solutions that empower enterprises with complete access, control, security, and scalability. With this insightful strategy and our peerless technical execution, HostForLife.eu has created the truly virtual data center—and made traditional hosting and managed/unmanaged services obsolete.

HostForLIFE.eu currently operates data center located in Amsterdam (Netherlands), offering complete redundancy in power, HVAC, fire suppression, network connectivity, and security. With over 53,000 sq ft of raised floor between the two facilities, HostForLife has an offering to fit any need. The datacenter facility sits atop multiple power grids driven by TXU electric, with PowerWare UPS battery backup power and dual diesel generators onsite. Our HVAC systems are condenser units by Data Aire to provide redundancy in cooling coupled with nine managed backbone providers.

HostForLife.eu does operate a data center located in Washington D.C (United States) too and this data center is best fits to customers who are targeting US market. Starting on Jan 2013, HostForLife.eu will operate a new data centre facility located in Singapore (Asia).

With three data centers that are located in different region, HostForLife.eu commits to provide service to all the customers worldwide. They hope they can achieve the best Windows and ASP.NET Hosting Awards in the World by the end of 2013.

About HostForLIFE.eu

HostForLife.eu is Microsoft No #1 Recommended Windows and ASP.NET Hosting in European Continent. Their service is ranked the highest top #1 spot in several European countries, such as: Germany, Italy, Netherlands, France, Belgium, United Kingdom, Sweden, Finland, Switzerland and many top European countries.

Our number one goal is constant uptime. Our data center uses cutting edge technology, processes, and equipment. We have one of the best up time reputations in the industry.

Our second goal is providing excellent customer service. Our technical management structure is headed by professionals who have been in the industry since its inception. We have customers from around the globe, spread across every continent. We serve the hosting needs of the business and professional, government and nonprofit, entertainment and personal use market segments.



European DNN 7 Hosting with HostForLIFE.eu - Amsterdam Server

December 18, 2012 06:08 by author Scott

DotNetNuke 7 comes with new features and improved productivity, and is claimed to be the most advanced DotNetNuke development framework to date. It adds support for Active Directory and SharePoint Lists.

The following new features have been added:

Installing Takes Just A Few Clicks 
With the updated look and feel, DNN 7 installation completes in a few steps with a simplified install process. Even though it is quick, it still allows you to configure advanced features.

Optimized Control Panel Experience
DotNetNuke 7 comes with a new control panel. Each menu has been updated to offer a more intuitive experience. End users can personalize the menu by creating their own bookmarks within it. The Modules, Pages and Users menu items provide quick access to common features in those areas.

Share Content Across Multiple Sites
DotNetNuke 7 allows you to share the same module across multiple portals/sites within the same DotNetNuke instance, making it easy to share the same HTML content across sites.

Drag And Drop Is Better Than Ever
You can now drag modules between panes and from the control panel to a pane while in edit mode.  Drag and Drop allows page designers to quickly and easily add content and arrange it on a page however they like.  

Enhanced SharePoint Support and Active Directory Simplifies User Authentication

SharePoint List support and Active Directory authentication are added to the Enterprise edition of DNN 7. Users can log in with their Active Directory credentials.
DotNetNuke 7.0 extends your SharePoint investment by adding support for popular SharePoint Lists in the Microsoft SharePoint Connector.

You can get all the new features above as low as €2.45/month at HostForLIFE.eu.



European Visual Studio 2010 Hosting - Amsterdam :: Easy to Deploy Package Using Visual Studio 2010

December 7, 2012 07:23 by author Scott

Deploying a web project with all its correct dependencies is not a trivial task. Some of the assets which need to be considered during deployment are:

  • Web Content (.aspx, .ascx, images, xml files, PDBs, Binaries etc)
  • IIS Settings (Directory browsing, Error pages, Default Documents etc)
  • Databases that the web project uses
  • GAC Assemblies and COM components which the web project depends upon
  • Registry Settings that may be used within the web project
  • Security Certificates
  • App Pools

In an enterprise environment a web application, with all of its dependencies, needs to move across various environments before finally being deployed to a production server. A typical set of transition servers is development, testing/QA, staging/pre-production and production. On top of that, production environments often use web farms where these webs need to be replicated. Today, doing all these things is more or less a manual process and involves tons of documentation that both developers and server admins have to deal with. Even with all the documentation, the steps are very error-prone.

To aid all these scenarios we are introducing the concept of a "Web Package". A Web Package is an atomic, transparent, self-describing unit representing your web which can easily be hydrated onto any IIS web server to reproduce your web. VS 2010 uses MSDeploy to create the web package from your web application.

In today's post I will be primarily focusing on creating a web package from VS 2010 which has IIS Settings as well as web content.

Step 1: Configure your Web Application Project (WAP) to use IIS Settings

For this discussion we have BlogEngine.Web downloaded from CodePlex and converted into a WAP. This project was then opened in VS 2010, and the VS10 migration wizard moved the project into the VS10 format. Thanks to the multi-targeting features in VS 2010, which support .NET versions 2.0 through 4.0, it is up to you which Framework version you want to run your web against.

Step 2: Configure IIS Settings in IIS Manager

Most IIS 7 web applications use the IIS integrated pipeline, which is what the IIS "Default App Pool" is configured for. BlogEngine.Web does not use integrated mode and will throw the error shown below if made to run under the "Default App Pool".

To get rid of this error I changed the App Pool of this application to "Classic App Pool" and then the application runs great as shown below:

App Pool mapping is just one of the IIS settings your app may use; there are various other IIS settings you can configure using IIS Manager (e.g. default document, error pages, etc.), all of which are relevant depending on your application scenario. The good news is that VS 2010 and MSDeploy will auto-detect all the changes you make to the default IIS settings and pick them up for deployment.

Essentially, at the end of this step you should have your web application up and running with all the IIS settings configured in IIS Manager. 

Step 3: Configure Package Settings

In VS 2010 we have introduced one additional property page for WAPs called "Publish" as shown below:

Let us look at the various properties of this tab to understand how it works:

Configuration Aware Tab: Note that the Publish tab is build configuration aware:

The Publish tab is made configuration aware because deployment settings tend to change from environment to environment; for example, developers often want to deploy their “Debug” configuration to a test server and include PDBs as part of that deployment, while the same web deployed in “Release” configuration to a production server may exclude PDBs.

Items to Package/Publish – This section will help you decide what type of content you would really like to package/deploy.

- Types of Files: By default this option is set to "Only files needed to run this application" .  This is usually sufficient for your deployment as it includes all the files from your project except source code, project files and other crud files not required to be deployed...  But apart from that there are two additional options available as shown below...

"All files in this project" and "All files in this project folder" options are very similar to what Publish WAP options were in VS 2008

- Exclude Files from App_Data folder – “App_Data” is a special ASP.NET folder where many developers like to put their SQL Express DBs (.mdf/.ldf files), XML files and other content which they consider data. In many situations a full version of SQL Server is available on the production web server and using SQL Express is not all that relevant. In such a scenario (and for the corresponding build configuration, e.g. “Release”) a user can check “Exclude Files from App_Data”.



- Exclude Generated Debug Symbols – It is important to understand that generation of debug symbols is different from deployment of the same. This check box will tell VS 10 whether you would like to package/deploy the already generated Debug Symbols.

Package Items



- IIS Settings – Checking this checkbox informs VS10 that you are ready to take all of your IIS settings configured for your application in IIS Manager as part of your web package. I am glad to tell you that IIS 5.1, IIS 6 and IIS 7 environments are all supported as part of this feature, so whether you are working on XP, Win2K, Win2K3, Vista or Win2K8 you should have no issue with packaging IIS settings.

These setting includes the "App Pool mapping" your web is configured to run against (e.g. "Classic App Pool" mapping discussed in Step 2)

- Additional Settings – The items in this grid are advanced properties. It is still good to know about them because they impact what will be included in your package. Most of the properties in this grid relate to the entire server and not just to your application, so you should use them very carefully.

Currently VS10 only displays "Application Pool Settings", but behind the scenes it is possible to configure VS10 to support packaging the root web.config, machine.config, security certificates, ACLs, etc.

Package Settings

- Create MSDeploy Package as a ZIP file – This checkbox allows you to decide whether you would like to create your web package as a .zip file or as a folder structure. If you are concerned about size and move the web package around very often, the .zip format makes sense; on the other hand, if you want to compare two packages using diff tools (either in source control or independently), the folder format is easier to work with.

- Package Location – This is an important and required property as it defines the path at which Visual Studio will place your web package. If you choose to change this path, make sure that you have write access to the location. Do note that the Package Location is modified based on whether you choose to create the web package as a .zip file or as a folder structure.

- Destination IIS Application Path/Name – This property allows you to specify the IIS application name that will be used on the destination web server.

- Destination Application Physical Path – One of the most important pieces of information embedded inside the web package is the physical location where the package should be installed. This property allows you to pre-specify it. You will have an opportunity to change both the IIS application physical path and the application name at deployment time, but this property page lets you choose a default value.

Step 4: Create the "Web Package"

This is the last step in creating the web package, and the simplest too. The idea is that once you configure the above settings, creating a package should be easy; in fact, even if you do not go to the "Publish" tab, we have tried to set smart defaults so that in most normal circumstances creating a web package should be just the two steps below:

  • Right Click on your "Project"
  • Click on Package --> Create Package

Once you click on this command you should start getting output messages around your package creation pumped into your output window... 

When you see “Publish Succeeded” as below in the output window then your package is successfully created.

To access the package go to the location specified in the “Package Location” textbox. By default this is in obj/Configuration/Package folder under your project root directory (Configuration here implies Active Configuration like Debug/Release etc).

Note: "Create Package" command creates web package only for Active configuration. By default “Debug” is the active configuration inside Visual Studio. If you would like to change the Active configuration you can do so by using Build --> Configuration Manager.

You can certainly set properties for all available configurations by switching the configuration at the top of the “Publish” tab, but that action does not change the Active configuration.

Finally, you can also automate creation of web packages via your team build environment as everything discussed above is supported via MSBuild Tasks.  In subsequent posts we will get into the details of these areas too...
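As a rough sketch of that automation (the project file name and output path here are placeholders), the same package can typically be produced from a command line or build server with:

msbuild BlogEngine.Web.csproj /t:Package /p:Configuration=Release /p:PackageLocation="C:\Builds\BlogEngine.Web.zip"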

Hope this helps... 

 


