European Windows 2012 Hosting BLOG

BLOG about Windows 2012 Hosting and SQL 2012 Hosting - Dedicated to European Windows Hosting Customers

European SQL Hosting - Amsterdam :: How to Truncate Log File in SQL Server 2008/2008R2/2012

clock February 22, 2013 07:17 by author Scott

When we create a new database in SQL Server, it typically creates two physical files in the operating system: one with an .MDF extension and another with an .LDF extension.
* .MDF is the Primary Data File.
* .LDF is the Transaction Log File.
 
Sometimes it seems impossible to shrink the transaction log file. The following code shrinks the log file to the minimum size possible: it switches the database to the SIMPLE recovery model (which allows the log to be truncated), shrinks the file, and then switches back to FULL. Replace DBName with your database name and DBName_log with the logical name of the log file. Note that switching to SIMPLE breaks the log backup chain, so take a full backup afterwards if you rely on log backups.

USE [DBName]
GO
ALTER DATABASE [DBName] SET RECOVERY SIMPLE WITH NO_WAIT
DBCC SHRINKFILE([DBName_log], 1)
ALTER DATABASE [DBName] SET RECOVERY FULL WITH NO_WAIT
GO
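
Not sure of the log file's logical name, or how much log space is actually in use before and after the shrink? These two standard commands help (DBName is again a placeholder):

USE [DBName]
GO
-- Logical file names and current sizes (size is stored in 8 KB pages)
SELECT name, type_desc, size * 8 / 1024 AS size_mb
FROM sys.database_files

-- Log size and percent used for every database on the instance
DBCC SQLPERF(LOGSPACE)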

European Visual Studio LightSwitch Hosting :: Forms Authentication in LightSwitch

clock February 21, 2013 06:40 by author Scott

There are two types of authentication available in LightSwitch applications.

  1. Forms Authentication
  2. Windows Authentication

In this article, I will discuss the first one, Forms Authentication.

Forms Authentication:

Forms Authentication means that the application prompts for a username/password when it opens and checks these values against the database.
 
This works nicely for clients running across the Internet that are not on a Windows domain. Let's choose Forms Authentication.

Setting up LightSwitch Solution:

Create a LightSwitch Desktop Application and create a table called Person as shown in the following picture.

Design a screen for the Person table. We will discuss screen design in future articles.

Select the List and Details Screen from the Screen Template list [No: 1] to show the list of Person records along with the details of the selected Person.

Give the Screen a name [No: 2].

Select the data for the Screen Data ComboBox [No: 3], which is retrieved from the table we have created.

Enabling Authentication:

By default, authentication is not enabled. To enable it, select the project's properties from the Project menu.
Select the Access Control [No: 1] menu tab.

From the Access Control tab, select the Use Forms Authentication option [No: 2].

That is all it takes to enable Forms Authentication. The next step is to create Permissions, Roles, and Users. We will discuss Roles and Users later; first we will see how Permissions work.

Adding Permissions:

Permissions simply grant the user the right to perform particular actions in our LightSwitch application.

In the above picture we have created three permissions: Read, Write and Delete. SecurityAdministration is the default permission provided by LightSwitch for creating Users and Roles.

We have granted all four permissions in Debug mode, so that while debugging the user can Read, Write and Delete, and can add Users and Roles. This is how the permissions we have created take effect during testing.

Setting Permissions in Methods:

To add a method that checks a permission, we need to open the table we have created. For this application that is the Person table; just open it.

In the top right corner you can see the Write Code menu. Click it and you will get a collection of items; from these, select the Access Control group as shown in the figure above.

Click on People_CanInsert, _CanDelete and _CanRead. These are the default access control methods provided by LightSwitch.

Now write a line of code in each method, as sketched below.

When we create Permissions in the Access Control tab of the Project Properties, LightSwitch generates a Permissions enumeration for us containing each permission we created.

The _CanDelete method checks whether the currently logged-in user has permission to delete; if so, the application allows the user to delete.

The other methods work in the same way as _CanDelete.
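
Since the figures are not reproduced here, this is a minimal C# sketch of what those methods typically look like, assuming the permission names Read, Write and Delete created earlier (adjust the names to match your Access Control tab):

partial void People_CanDelete(ref bool result)
{
    // Allow deletes only when the current user holds the Delete permission
    result = this.Application.User.HasPermission(Permissions.Delete);
}

partial void People_CanInsert(ref bool result)
{
    result = this.Application.User.HasPermission(Permissions.Write);
}

partial void People_CanRead(ref bool result)
{
    result = this.Application.User.HasPermission(Permissions.Read);
}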

Now we are ready to press F5. Just press it.

In Debug mode the LightSwitch application will not show the login form; it asks for a user name and password only in the published application. But the permissions we have granted do take effect in Debug mode. Let's see it in action.

As we have granted all the Permissions, the Add, Edit and Delete buttons are enabled here.

Now deselect the Delete permission in Project Properties to check that it prevents the user from deleting data.

Here we have deselected the Delete permission. Let's see the result in code and on screen.

We can see that the result is false because the Delete permission is deselected.

As the Delete permission is deselected, the Delete button is disabled in the application.

So far we have only exercised Permissions in Debug mode; we still need to create Users and Roles, and assign Permissions to the Roles we create.

In Part II we will discuss how to add Users, Roles, and Permissions for those Roles.



European SSRS Hosting - Amsterdam :: Fixing - Your browser does not support scripts or has been configured not to allow scripts.

clock February 15, 2013 07:51 by author Scott

Sometimes our clients complain about this issue when they run a report in Report Manager:

Your browser does not support scripts or has been configured not to allow scripts.

So here is today's post; I hope it helps all of you who are facing this problem too. What is the solution?

- In Internet Explorer, go to Tools > Internet Options
- Go to the Security tab
- Select the Trusted sites zone from the list of available zones
- Click the Sites button
- In the new window, add the URL of Report Manager (e.g. http://MACHINE_NAME/*); if Report Manager is not running on the default port, include the port number as well

Then, it works like a charm. :)

European SQL Hosting - Amsterdam :: How to Fix Cannot resolve the collation conflict between "..." and "..." in the equal to operation

clock February 14, 2013 05:12 by author Scott

Sometimes you’ll get this error message when querying your database:

"Cannot resolve the collation conflict between "SQL_Latin1_General_CP1_CI_AS" and "Latin1_General_CI_AI" in the equal to operation"

So, how to fix this error?


Just use the following syntax to collate on the fly when joining tables with different collations. I integrate systems, so I have to do this a lot.

select * from [Product] p join [category] c
on 
c.[Name] collate SQL_Latin1_General_CP1_CI_AS
=
p.[Name] collate SQL_Latin1_General_CP1_CI_AS
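
If you would rather not hard-code a collation name, coercing both sides to the current database's default collation usually works just as well; a variant of the same query:

select * from [Product] p join [category] c
on
c.[Name] collate DATABASE_DEFAULT
=
p.[Name] collate DATABASE_DEFAULT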

Hope this helps.



European Cloud Hosting - Amsterdam :: What is Cloud Hosting? The Benefits of Using Cloud in Your Environment

clock February 13, 2013 09:25 by author Scott

It seems that everyone is talking cloud these days. Have you been thinking about making the switch over to cloud hosting but not really sure if it’s the right fit for your business? Cloud hosting offers a wide variety of benefits. Let’s take a look at some of the most important ones…

Scaling Resources

The ability to scale resources (such as memory, CPU, and SAN storage space) is an important piece of cloud hosting.

Scalability gives your business the ability to handle major traffic fluctuations during busy seasons. Think of a florist website on Valentine’s Day, a swimwear store at the height of summer, or a toy store around the holidays. Scaling resources ensures full availability even with an influx of visitors from a special promotion, ad campaign or media mention. Cloud hosting also allows resources to be scaled down when the busy season slows - very important for controlling costs.

And scalability in the cloud just got better! The new Microsoft Windows 2012 server management capability offers increased scaling abilities for cloud customers, with up to 64 CPUs and 1TB memory per cloud server.  Click to learn more about our Windows Cloud Servers.

Better Security

Security and safety of data is another benefit the cloud brings. With shared hosting, what happens on other customers’ sites can affect your own; cloud hosting removes this risk, and its dedicated environment boasts increased security features. A cloud set-up is loaded with anti-malware software which protects from viruses and other digital threats. Also, all of your business data and records are backed up in case of failure. You get real peace of mind with the security cloud hosting offers.

Automatic Failover

When you’re in the cloud, hardware issues are no longer a major headache. With cloud hosting, you have access to automatic failover. If any piece of hardware should run into issues, your server will automatically migrate to another hardware node in our highly-redundant cluster. This helps to create a fully available, down-time free environment.

Live migration

Live migration is another big benefit driven by cloud hosting and helps to improve the end user experience. Live migration allows hardware maintenance and updates or upgrading of the servers without causing any downtime.

European SQL 2012 Hosting - Amsterdam :: Why a Session With sp_readrequest Takes so Long to Execute

clock February 12, 2013 05:36 by author Scott

While running our Long Running Sessions Detection Job on a production server, we started receiving alerts that a session had been running for more than 3 minutes. But what was this session actually doing? Here is the alert report.

SP ID | Stored Procedure Call     | DB Name | Executing Since
------|---------------------------|---------|----------------
58    | msdb.dbo.sp_readrequest;1 | msdb    | 3 min

sp_readrequest is a system stored procedure, which basically reads a message request from the queue and returns its contents.

This process can remain active for the time configured in the DatabaseMailExeMinimumLifeTime parameter, which is set when the Database Mail profile is configured. The default value for this external mail process is 600 seconds. According to BOL, DatabaseMailExeMinimumLifeTime is the minimum amount of time, in seconds, that the external mail process remains active.

This can be changed at the time of mail profile configuration, or you can simply use an update query to change it.

UPDATE msdb.dbo.sysmail_configuration
SET
paramvalue = 60 --60 Seconds
WHERE paramname = 'DatabaseMailExeMinimumLifeTime'


We have changed this to 60 seconds to resolve our problem.
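
To confirm the current value before or after the change, you can query the same configuration table:

SELECT paramname, paramvalue
FROM msdb.dbo.sysmail_configuration
WHERE paramname = 'DatabaseMailExeMinimumLifeTime'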

European ASP.NET Ajax Hosting :: jQuery AJAX not working in IE, but working in Firefox, Chrome, Safari. Why?

clock February 8, 2013 08:03 by author Scott

You may have come across the situation where you are making a jQuery AJAX GET request to update part of your page with changing content, but it never updates or just returns blank content.

What it actually comes down to is that the ‘Get’ call is working correctly, but Internet Explorer (IE) caches the response, so you never get the updated results from the server.

What are the options:

1. Use POST rather than GET – although this is not semantically correct (you should only POST when modifying content on the server), technically there is not much difference.

2. (Preferred) You can disable caching globally using $.ajaxSetup(), for example:

$.ajaxSetup({ cache: false });


This appends a timestamp parameter (_=...) to the query string when making the request, so IE sees each request as unique. To turn caching off for a particular $.ajax() call, set cache: false on it locally, like this:

$.ajax({
    cache: false,
    //other options...
});



Smarter Mail Issue :: Smartermail Attachments not working

clock February 6, 2013 07:59 by author Scott

Sometimes, permissions can become corrupted for SmarterMail, which prevents attachments from being displayed in webmail. This article goes over fixing this.

Steps
An error like this can sometimes be seen when trying to get the attachment:
XML Parsing Error: no element found

Before you proceed, run the SmarterMail self-diagnostic (also known as the checkup page); this will help you identify the problem.

  1. Log in to SmarterMail as the system administrator.
  2. Click Settings (top left).
  3. Under Activation click SmarterMail Self Diagnostic.

This will launch a separate window showing you the problem.

For permissions, check these folders:

  • C:\Program Files (x86)\SmarterTools\SmarterMail\MRS\MailProcessing
  • C:\Program Files (x86)\SmarterTools\SmarterMail\MRS\App_Data

Grant read and write permissions on these folders to your user, NETWORK SERVICE, and IUSR. Then run the Self Diagnostic test again.
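
If you prefer to script this instead of using the Security tab in Explorer, the standard icacls tool can grant the permissions from an elevated command prompt; a sketch (adjust paths and account names to your installation):

icacls "C:\Program Files (x86)\SmarterTools\SmarterMail\MRS\MailProcessing" /grant "NETWORK SERVICE":(OI)(CI)M /grant "IUSR":(OI)(CI)M
icacls "C:\Program Files (x86)\SmarterTools\SmarterMail\MRS\App_Data" /grant "NETWORK SERVICE":(OI)(CI)M /grant "IUSR":(OI)(CI)M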

European SQL 2012 Hosting - Amsterdam :: How to Transfer Data Between Two SQL Server Databases

clock January 30, 2013 07:42 by author Scott

Sometimes we need to transfer database data and objects from one server to another. In this article I present a tool which can transfer data and database objects, like tables and stored procedure scripts, to another SQL Server database. In practice I have transferred database objects from SQL Server 2005 to SQL Server 2012. Then I thought it could help others who face these kinds of problems!

Background 

Once I had a requirement to transfer data between two online databases. I saw many tools on the internet for transferring data between two SQL Server databases, but I decided to develop this kind of tool myself because I believed that if you write some code you learn something new... 

In this article we learn the following points: 

- How to connect to a SQL Server database. 
- How to generate a Table and SP script programmatically. 
- How to copy data between two tables using Bulk Copy.
- How to insert data in an Identity column manually.  

Using the code 

The name of this project is DataTransfer. I used a Windows Forms Application in Visual Studio 2008 to develop this project. The UI is designed in three sections. See the image below pointing out the five main functionalities:



1. Point-1 in section-1: this section takes the source server information. The source server is the SQL Server database that has the data and objects we need to transfer.

2. Point-2 in section-2: this section takes the destination server information. The destination server is the SQL Server database where we place the transferred objects and data.

3. Point-3 in sections 1 and 2: used to set the SQL Server connection authentication, since we connect to a database using either SQL Server Authentication or Windows Authentication.

Before we transfer an object or data for the first time, we build the source and destination server connections using the information above.

Connection string build code:

public void BuildConnectionString()
{
    if (chkSAuthentication.Checked)
    {
        strSrc = "Data Source=" + txtSServerName.Text + ";Initial Catalog=" +
          txtSDatabase.Text + ";User Id=" + txtSLogin.Text +
          ";Password=" + txtSPassword.Text;
    }
    else
    {
        strSrc = "Data Source=" + txtSServerName.Text +
          ";Initial Catalog=" + txtSDatabase.Text + ";Integrated Security=True";
    }

    if (chkDAuthentication.Checked)
    {
        strDst = "Data Source=" + txtDServerName.Text + ";Initial Catalog=" +
          txtDDatabase.Text + ";User Id=" + txtDLogin.Text +
          ";Password=" + txtDPassword.Text;
    }
    else
    {
        strDst = "Data Source=" + txtDServerName.Text +
          ";Initial Catalog=" + txtDDatabase.Text +
          ";Integrated Security=True";
    }
}
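
String concatenation works, but it breaks if a password contains a semicolon or quote. A slightly safer variant (a sketch, not from the original article) uses SqlConnectionStringBuilder, which handles escaping for you:

using System.Data.SqlClient;

string BuildConnectionString(string server, string database,
    bool useSqlAuthentication, string login, string password)
{
    var builder = new SqlConnectionStringBuilder
    {
        DataSource = server,       // e.g. txtSServerName.Text
        InitialCatalog = database  // e.g. txtSDatabase.Text
    };

    if (useSqlAuthentication)
    {
        builder.UserID = login;
        builder.Password = password;
    }
    else
    {
        builder.IntegratedSecurity = true;
    }

    return builder.ConnectionString;
}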


1. Point-4 in section-3: used to choose the transfer behavior. There are two main options: one for a Table object and another for a Stored Procedure object. When we select Table, the text box accepts the name of the table to transfer; when we select SP, it accepts the stored procedure name.

2. Point-5 in section-3: used mainly when we transfer a Table object from one database to another. When we transfer a table there are two transferable parts: the table script and the table data. This application creates a table script from the source database and executes that script to create the table in the destination database.

To create the table script we use a T-SQL batch. When executed, it returns a DataTable whose rows together contain the complete CREATE TABLE script, including the Identity and Primary Key definitions.

Table script generation code:

public string GetTableScript(string TableName, string ConnectionString)
{
    string Script = "";

    string Sql = "declare @table varchar(100)" + Environment.NewLine +
    "set @table = '" + TableName + "' " + Environment.NewLine +
        //"-- set table name here" +
    "declare @sql table(s varchar(1000), id int identity)" + Environment.NewLine +
    " " + Environment.NewLine +
        //"-- create statement" +
    "insert into  @sql(s) values ('create table [' + @table + '] (')" + Environment.NewLine +
    " " + Environment.NewLine +
        //"-- column list" +
    "insert into @sql(s)" + Environment.NewLine +
    "select " + Environment.NewLine +
    "    '  ['+column_name+'] ' + " + Environment.NewLine +
    "    data_type + coalesce('('+cast(character_maximum_length as varchar)+')','') +
    ' ' +" + Environment.NewLine +
    "    case when exists ( " + Environment.NewLine +
    "        select id from syscolumns" + Environment.NewLine +
    "        where object_name(id)=@table" + Environment.NewLine +
    "        and name=column_name" + Environment.NewLine +
    "        and columnproperty(id,name,'IsIdentity') = 1 " + Environment.NewLine +
    "    ) then" + Environment.NewLine +
    "        'IDENTITY(' + " + Environment.NewLine +
    "        cast(ident_seed(@table) as varchar) + ',' + " + Environment.NewLine +
    "        cast(ident_incr(@table) as varchar) + ')'" + Environment.NewLine +
    "    else ''" + Environment.NewLine +
    "   end + ' ' +" + Environment.NewLine +
    "    ( case when IS_NULLABLE = 'No' then 'NOT ' else '' end ) + 'NULL ' + " + Environment.NewLine +
    "    coalesce('DEFAULT '+COLUMN_DEFAULT,'') + ','" + Environment.NewLine +
    " " + Environment.NewLine +
    " from information_schema.columns where table_name = @table" + Environment.NewLine +
    " order by ordinal_position" + Environment.NewLine +
    " " + Environment.NewLine +
        //"-- primary key" +
    "declare @pkname varchar(100)" + Environment.NewLine +
    "select @pkname = constraint_name from information_schema.table_constraints" + Environment.NewLine +
    "where table_name = @table and constraint_type='PRIMARY KEY'" + Environment.NewLine +
    " " + Environment.NewLine +
    "if ( @pkname is not null ) begin" + Environment.NewLine +
    "    insert into @sql(s) values('  PRIMARY KEY (')" + Environment.NewLine +
    "    insert into @sql(s)" + Environment.NewLine +
    "        select '   ['+COLUMN_NAME+'],' from information_schema.key_column_usage" +
Environment.NewLine +
    "        where constraint_name = @pkname" + Environment.NewLine +
    "        order by ordinal_position" + Environment.NewLine +
        //"    -- remove trailing comma" +
    "    update @sql set s=left(s,len(s)-1) where id=@@identity" + Environment.NewLine +
    "    insert into @sql(s) values ('  )')" + Environment.NewLine +
    "end" + Environment.NewLine +
    "else begin" + Environment.NewLine +
        //"    -- remove trailing comma" +
    "    update @sql set s=left(s,len(s)-1) where id=@@identity" + Environment.NewLine +
    "end" + Environment.NewLine +
    " " + Environment.NewLine +
    "-- closing bracket" + Environment.NewLine +
    "insert into @sql(s) values( ')' )" + Environment.NewLine +
    " " + Environment.NewLine +
        //"-- result!" +
    "select s from @sql order by id";
    DataTable dt = GetTableData(Sql, ConnectionString);
    foreach (DataRow row in dt.Rows)
    {
        Script += row[0].ToString() + Environment.NewLine;
    }

    return Script;
}
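
The Utility helper used above (GetTableData, plus ExecuteQuery and GetIdentityColumn used later on) is not shown in the article; here is a minimal sketch of what it presumably looks like, using the standard ADO.NET types:

using System.Data;
using System.Data.SqlClient;

public class Utility
{
    // Run a query and return its result set as a DataTable
    public DataTable GetTableData(string sql, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter(sql, connection))
        {
            var table = new DataTable();
            adapter.Fill(table);
            return table;
        }
    }

    // Execute a statement that returns no rows (e.g. CREATE TABLE, INSERT)
    public void ExecuteQuery(string sql, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }

    // Return the name of the table's identity column, or "" if it has none
    public string GetIdentityColumn(string tableName, string connectionString)
    {
        string sql = "select name from sys.columns " +
                     "where object_id = object_id('" + tableName + "') " +
                     "and is_identity = 1";
        DataTable dt = GetTableData(sql, connectionString);
        return dt.Rows.Count > 0 ? dt.Rows[0][0].ToString() : "";
    }
}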

To create an SP script we use the built-in SQL Server stored procedure sp_helptext, which takes the SP name as a parameter.

SP script generate code:

public string GetSPScript(string SPName, string ConnectionString)
{
    string Script = "";

    string Sql = "sp_helptext '" + SPName + "'";

    DataTable dt = GetTableData(Sql, ConnectionString);
    foreach (DataRow row in dt.Rows)
    {
        Script += row[0].ToString() + Environment.NewLine;
    }

    return Script;
}

Once we have both scripts from the source database, we simply execute them against the destination database to transfer the objects.

Now we transfer the data between the two servers. In this project we use two options to transfer data: the bulk copy method, or generating INSERT statements from the source table and data and then executing those statements on the destination server.

Bulk data copy code:

void TransferData()
{
    try
    {
        DataTable dataTable = new Utility().GetTableData("Select * From " + txtTableName.Text, strSrc);

        SqlBulkCopy bulkCopy = new SqlBulkCopy(strDst, SqlBulkCopyOptions.TableLock)
        {
            DestinationTableName = txtTableName.Text,
            BatchSize = 100000,
            BulkCopyTimeout = 360
        };
        bulkCopy.WriteToServer(dataTable);

        MessageBox.Show("Data transfer successful.");
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
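
One caveat: SqlBulkCopy maps columns by ordinal position by default. If the destination table's column order differs from the source, you can map columns by name before calling WriteToServer; a small sketch using the same dataTable and bulkCopy as above:

// Map every source column to the destination column with the same name
foreach (DataColumn column in dataTable.Columns)
{
    bulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
}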

Copy data using Insert statements code:  

void TransferDataWithTableScript()
{
    try
    {       

        DataTable dataTable = new Utility().GetTableData("Select * From " + txtTableName.Text, strSrc);

        if (!string.IsNullOrEmpty(new Utility().GetIdentityColumn(txtTableName.Text, strSrc)))
        {
            string InsertSQL = "";
            InsertSQL += "SET IDENTITY_INSERT [" + txtTableName.Text + "] ON " + Environment.NewLine;

            string ColumnSQL = "";
            foreach (DataColumn column in dataTable.Columns)
            {
                ColumnSQL += column.ColumnName + ",";
            }
            ColumnSQL = ColumnSQL.Substring(0, ColumnSQL.Length - 1);

            foreach (DataRow row in dataTable.Rows)
            {
                string ColumnValueL = "";
                foreach (DataColumn column in dataTable.Columns)
                {
                    // Escape embedded single quotes by doubling them
                    ColumnValueL += "'" + row[column.ColumnName].ToString().Replace("'", "''") + "',";
                }
                ColumnValueL = ColumnValueL.Substring(0, ColumnValueL.Length - 1);

                InsertSQL += "Insert Into " + txtTableName.Text +
                  " (" + ColumnSQL + ") Values(" +
                  ColumnValueL + ")" + Environment.NewLine;
            }

            InsertSQL += "SET IDENTITY_INSERT [" + txtTableName.Text + "] OFF " + Environment.NewLine;

            new Utility().ExecuteQuery(InsertSQL, strDst);
        }       

        MessageBox.Show("Data transfer successful.");
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}

We need the second copy option because when a table has an Identity column, a plain copy generates new values for the identity column instead of preserving the existing ones. To keep the original values, we wrap the generated INSERT statements in SET IDENTITY_INSERT: one statement to turn it on before the inserts run on the destination server, and another to turn it off afterwards.

Identity column code:  

SET IDENTITY_INSERT [TableName] ON   -- before executing the insert statements
SET IDENTITY_INSERT [TableName] OFF  -- after executing the insert statements

European SQL 2012 Hosting - Amsterdam :: Tabular Models vs PowerPivot Models in SQL 2012

clock January 23, 2013 06:32 by author Scott

In SQL Server 2012, there is a new data model, called tabular, that is part of the new feature called the Business Intelligence Semantic Model (BISM).  BISM also includes the multidimensional model (formerly called the UDM).

The Tabular model is based on concepts like tables and relationships that are familiar to anyone who has a relational database background, making it easier to use than the multidimensional model.  The tabular model is a server mode you choose when installing Analysis Services.

The tabular model is an enhancement of the current PowerPivot data model experience, both of which use the VertiPaq engine.  When opening a PowerPivot for SharePoint workbook, an SSAS cube is created behind the scenes, which is why PowerPivot for SharePoint requires SSAS to be installed.

So if tabular models and PowerPivot models use the same Analysis Services engine, why are tabular models necessary when we already have PowerPivot?

There are four things that tabular models offer that PowerPivot models do not:

  1. Scalability - PowerPivot has a 2 GB limit for the size of the Excel file and does not support partitions, but tabular models have no such limit and do support partitions.  Tabular models also support DirectQuery.
  2. Manageability - There are a lot of tools you can use with the tabular model that you can't use with PowerPivot: SSMS, AMO, AMOMD, XMLA, the Deployment Wizard, AMO for PowerShell, and Integration Services.
  3. Securability - Tabular models can use row security and dynamic security, neither of which PowerPivot supports; PowerPivot has only Excel workbook file security.
  4. Professional development toolchain - Tabular models live in the Visual Studio shell, so they enjoy all the shell services such as integrated source control, msbuild integration, and Team Build integration.  PowerPivot lives in the Excel environment, so it is limited to the extensibility model provided in Excel (which does not include source control or build configuration).  Also, because tabular models live in the VS environment, build and deployment can be naturally separated.


So Analysis Services can now be installed in one of three server modes: Multidimensional and Data Mining (default), PowerPivot for SharePoint, and Tabular.

More info:

When to choose tabular models over PowerPivot models

Comparing Analysis Services and PowerPivot

Feature by Server Mode or Solution Type (SSAS)

About HostForLIFE.eu

HostForLIFE.eu is a European Windows Hosting Provider which focuses on the Windows Platform only. We deliver on-demand hosting solutions including Shared Hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes.

We offer the latest Windows 2016 Hosting, ASP.NET Core 2.2.1 Hosting, ASP.NET MVC 6 Hosting and SQL 2017 Hosting.

