European Windows 2012 Hosting BLOG

BLOG about Windows 2012 Hosting and SQL 2012 Hosting - Dedicated to European Windows Hosting Customer

SQL Server Hosting - HostForLIFE :: SQL Server String Functions: LEN, REPLICATE, SUBSTRING, and CHARINDEX (With Actual Examples)

clock November 18, 2025 07:28 by author Peter

In SQL Server, working with strings is very common, particularly when handling names, emails, or any data that needs to be partially masked.
We'll examine how to use the fundamental SQL Server string functions in this article:

  • SUBSTRING()
  • CHARINDEX()
  • LEN()
  • REPLICATE()

and combine them to mask an email address.

This is a perfect example for understanding how these functions behave in real-world scenarios.

Scenario
Suppose we want to store or display masked email addresses.
For example:
hfl@gmail.com  →  hf*****@gmail.com

We want to:

  • Keep the first 2 characters
  • Replace the next few characters with *
  • Keep the domain part (@gmail.com)

Let’s see how SQL handles this.

Sample SQL Code
DECLARE @firstname VARCHAR(100),
        @lastname  VARCHAR(100),
        @email     VARCHAR(100);

SET @firstname = 'Peter';
SET @lastname  = 'Scott';
SET @email     = 'hfl@gmail.com';

SELECT
    @firstname AS FirstName,
    @lastname AS LastName,
    SUBSTRING(@email, 1, 2)                     -- First 2 characters
        + REPLICATE('*', 5)                     -- Mask with 5 stars
        + SUBSTRING(@email, CHARINDEX('@', @email),
          LEN(@email) - CHARINDEX('@', @email) + 1)   -- Extract domain
        AS MaskedEmail;


Understanding the Functions
1. SUBSTRING()

Extracts part of a string.

Example
SUBSTRING('hfl@gmail.com', 1, 2)

Output
hf

2. CHARINDEX()
Finds the position of a character inside a string.

CHARINDEX('@', 'hfl@gmail.com')

Output
4

This tells us the @ symbol is at character position 4.

3. LEN()
Returns the total length of a string.

LEN('hfl@gmail.com')

Output
13

4. REPLICATE()

Repeats a character or string multiple times.

REPLICATE('*', 5)

Output
*****

Extracting the Domain
To extract everything from @ onward:

SUBSTRING(@email, CHARINDEX('@', @email), LEN(@email))

Output
@gmail.com

Passing LEN(@email) as the length is safe here: SUBSTRING simply stops at the end of the string, so the exact remaining length (LEN(@email) - CHARINDEX('@', @email) + 1, as used in the full example) isn't strictly required.
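If it helps to see the whole masking recipe in one place, here is a small Python sketch (the helper name mask_email is ours, not from the article) that mirrors SUBSTRING, CHARINDEX, LEN, and REPLICATE with slicing, find, len, and string repetition:

```python
def mask_email(email: str, visible: int = 2, stars: int = 5) -> str:
    """Mirror of the T-SQL expression: keep the first `visible` characters,
    insert `stars` asterisks, then append everything from '@' onward."""
    at = email.find("@")                  # CHARINDEX('@', @email), but 0-based
    prefix = email[:visible]              # SUBSTRING(@email, 1, 2)
    domain = email[at:]                   # SUBSTRING(@email, CHARINDEX(...), LEN(...))
    return prefix + "*" * stars + domain  # REPLICATE('*', 5)

print(mask_email("hfl@gmail.com"))  # hf*****@gmail.com
```

The only difference to watch is indexing: CHARINDEX is 1-based while Python's find is 0-based, which is why the T-SQL version needs the `+ 1` in the length calculation and the Python version does not.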

Final Output

FirstName | LastName | MaskedEmail
----------|----------|------------------
Peter     | Scott    | hf*****@gmail.com

HostForLIFE.eu SQL Server 2022 Hosting
HostForLIFE.eu is European Windows Hosting Provider which focuses on Windows Platform only. We deliver on-demand hosting solutions including Shared hosting, Reseller Hosting, Cloud Hosting, Dedicated Servers, and IT as a Service for companies of all sizes.



SQL Server Hosting - HostForLIFE :: Data Synchronization Between Cloud and On-Prem SQL Databases

clock November 13, 2025 09:38 by author Peter

In modern enterprise systems, it’s common to have data distributed across multiple environments — such as on-premises SQL Servers and cloud-hosted databases like Azure SQL Database or Amazon RDS for SQL Server. This hybrid setup provides scalability, performance, and security — but it also introduces a crucial challenge: keeping the data synchronized between both environments.

This article explores why synchronization is needed, various strategies to achieve it, and a technical implementation guide using SQL Server, Azure Data Sync, and custom APIs in ASP.NET Core.

Why Is Synchronization Needed?
Many organizations are in the process of moving from on-prem systems to the cloud. During this migration — or even after partial migration — both environments often remain active.

Challenges in Synchronization
While the concept seems simple — “keep data the same in two databases” — the technical execution can get complex due to:

Latency and Network Constraints: Cloud connectivity may not always be stable.

Conflict Resolution: Changes can occur simultaneously on both sides.

Schema Differences: Column types or names may differ across databases.

Data Volume: Handling millions of rows can lead to performance bottlenecks.

Security and Compliance: Data transfer must be encrypted and compliant.

To handle these challenges, we need a robust synchronization strategy.

Synchronization Strategies
There are several proven strategies to synchronize SQL data between on-prem and cloud databases. Let’s explore them in detail.

1. SQL Server Replication
SQL Server supports various types of replication:

  • Transactional Replication: Real-time updates from on-prem to cloud.
  • Merge Replication: Bi-directional sync, suitable when both systems can update data.
  • Snapshot Replication: Periodic full sync, good for static or reference data.

Pros

  • Natively supported by SQL Server.
  • Configurable conflict resolution.

Cons

  • Requires setup and maintenance of replication agents.
  • Not ideal for large schemas or high-latency networks.

2. Custom Sync via Change Tracking / Change Data Capture (CDC)
If you need more flexibility, you can build a custom synchronization service using SQL Server features like:

  • Change Tracking: lightweight, tracks which rows changed.
  • Change Data Capture (CDC): tracks both the old and new values for changed rows.

This allows your application to:

  • Fetch deltas since the last sync.
  • Push updates via APIs or a direct DB connection.
  • Handle conflicts in custom logic.
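The delta-fetch step boils down to a version high-water mark: remember the highest change version you have already pushed, and only process rows whose version is newer. A minimal Python sketch of that pattern (the function name and row shape are illustrative, not part of any SQL Server API):

```python
# Each change carries a monotonically increasing version number, as with
# SQL Server Change Tracking; the client stores the last version it synced.
def fetch_deltas(changes, last_sync_version):
    """Return rows changed after last_sync_version, plus the new high-water mark."""
    deltas = [c for c in changes if c["version"] > last_sync_version]
    new_version = max((c["version"] for c in deltas), default=last_sync_version)
    return deltas, new_version

changes = [
    {"id": 1, "version": 3, "name": "Alice"},
    {"id": 2, "version": 5, "name": "Bob"},
    {"id": 3, "version": 8, "name": "Carol"},
]
deltas, version = fetch_deltas(changes, last_sync_version=4)
print(len(deltas), version)  # 2 8
```

After a successful push, the caller persists the returned version (the role the SyncLog table plays later in this article), so the next run only sees newer changes.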

Pros

  • Full control over sync logic.
  • Works across any SQL Server versions or environments.
  • Extensible for custom conflict handling.

Cons

  • Requires manual implementation.
  • More code maintenance.

3. API-Based Synchronization (ASP.NET Core)
In some architectures, data synchronization is done via secure REST APIs instead of direct DB-to-DB communication.

Flow

  • On-prem application detects change.
  • It pushes updated data via API to cloud.
  • Cloud acknowledges and stores the record.

This approach is ideal when:

  • Security policies restrict direct DB connections.
  • You want an audit trail and retry mechanism.
  • You need partial data synchronization.

Implementation Example
Let’s go through a step-by-step implementation using SQL Change Tracking and ASP.NET Core APIs.

Step 1: Enable Change Tracking in SQL Server
ALTER DATABASE [MyOnPremDB]
SET CHANGE_TRACKING = ON
(CHANGE_RETENTION = 7 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.Customer
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = ON);


This will start tracking inserts, updates, and deletes.

Step 2: Create a Sync Log Table
CREATE TABLE SyncLog (
    Id INT IDENTITY PRIMARY KEY,
    TableName NVARCHAR(100),
    LastSyncVersion BIGINT,
    SyncDate DATETIME DEFAULT GETDATE()
);

You’ll store the last synchronized version number here.

Step 3: Fetch Changes Since Last Sync
DECLARE @last_sync BIGINT =
    (SELECT ISNULL(LastSyncVersion, 0) FROM SyncLog WHERE TableName = 'Customer');

DECLARE @current_version BIGINT = CHANGE_TRACKING_CURRENT_VERSION();

SELECT c.CustomerId, c.Name, c.Email, c.LastModified
FROM dbo.Customer AS c
JOIN CHANGETABLE(CHANGES dbo.Customer, @last_sync) AS CT
    ON c.CustomerId = CT.CustomerId;

Step 4: Push Data to Cloud via ASP.NET Core API

API Controller (ASP.NET Core)
[ApiController]
[Route("api/[controller]")]
public class SyncController : ControllerBase
{
    private readonly IConfiguration _config;
    private readonly SqlConnection _connection;

    public SyncController(IConfiguration config)
    {
        _config = config;
        _connection = new SqlConnection(_config.GetConnectionString("CloudDB"));
    }

    [HttpPost("upload")]
    public async Task<IActionResult> UploadChanges([FromBody] List<Customer> customers)
    {
        // Open the connection once for the whole batch instead of
        // opening and closing it inside the loop
        await _connection.OpenAsync();
        try
        {
            foreach (var customer in customers)
            {
                using var cmd = new SqlCommand("spUpsertCustomer", _connection)
                {
                    CommandType = CommandType.StoredProcedure
                };
                cmd.Parameters.AddWithValue("@CustomerId", customer.CustomerId);
                cmd.Parameters.AddWithValue("@Name", customer.Name);
                cmd.Parameters.AddWithValue("@Email", customer.Email);
                await cmd.ExecuteNonQueryAsync();
            }
        }
        finally
        {
            await _connection.CloseAsync();
        }

        return Ok(new { Status = "Success", Count = customers.Count });
    }
}

Step 5: Cloud Database Stored Procedure

CREATE PROCEDURE spUpsertCustomer
    @CustomerId INT,
    @Name NVARCHAR(100),
    @Email NVARCHAR(100)
AS
BEGIN
    IF EXISTS (SELECT 1 FROM Customer WHERE CustomerId = @CustomerId)
        UPDATE Customer
        SET Name = @Name, Email = @Email
        WHERE CustomerId = @CustomerId;
    ELSE
        INSERT INTO Customer (CustomerId, Name, Email)
        VALUES (@CustomerId, @Name, @Email);
END;

Step 6: Update Sync Log

After every successful sync:
UPDATE SyncLog
SET LastSyncVersion = @current_version, SyncDate = GETDATE()
WHERE TableName = 'Customer';

Conflict Resolution Strategy

If both sides can update data, you need a conflict resolution rule.

Common strategies include:

  • Last Write Wins (LWW): The most recent update overwrites older data.
  • Source Priority: Cloud or on-prem always wins.
  • Merge Policy: Combine field-level changes (e.g., append new comments).
  • Manual Review: Log conflicting records for admin intervention.
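Last Write Wins is the simplest of these rules to implement. A hedged Python sketch (the row shape with a last_modified field is an assumption, matching the LastModified column used earlier):

```python
from datetime import datetime

def resolve_lww(local_row: dict, remote_row: dict) -> dict:
    """Last-Write-Wins: given the same logical row from both sides,
    keep the copy with the later last_modified timestamp.
    Ties go to the local side here; pick a deterministic rule and keep it."""
    if local_row["last_modified"] >= remote_row["last_modified"]:
        return local_row
    return remote_row

a = {"id": 1, "email": "old@x.com", "last_modified": datetime(2025, 11, 1)}
b = {"id": 1, "email": "new@x.com", "last_modified": datetime(2025, 11, 5)}
print(resolve_lww(a, b)["email"])  # new@x.com
```

Note that LWW silently discards the losing update, so for auditable data you would typically combine it with the Manual Review option and log the discarded row.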

Performance Optimization Tips

  • Use batching: Send records in small chunks (e.g., 500 at a time).
  • Compress payloads: Use GZip or Brotli in HTTP requests.
  • Track only necessary columns: Avoid full-table comparisons.
  • Run sync during off-peak hours: Schedule jobs during low user activity.
  • Use async APIs: Avoid blocking sync agents.
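The batching tip is easy to get subtly wrong at the tail end of the data. A small Python sketch of a chunking helper (the name batches is ours) that the upload loop could iterate over, one API call per chunk:

```python
def batches(records, size=500):
    """Yield successive chunks of `size` records; the final chunk
    carries whatever remainder is left, so no record is dropped."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

rows = list(range(1234))
chunks = list(batches(rows, size=500))
print([len(c) for c in chunks])  # [500, 500, 234]
```

Keeping the chunk boundary logic in one generator like this also makes retry logic simpler: a failed API call can be retried for that chunk alone.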

Security Considerations

  • Always use HTTPS for API communication.
  • Encrypt sensitive data (AES, TLS).
  • Implement API key / OAuth2 authentication.
  • Maintain audit trails for sync operations.
  • Restrict access using firewalls and VPN connections.

Testing and Monitoring

Before going live:

  • Validate both sides’ schema and indexes.
  • Test with sample data changes (inserts, updates, deletes).
  • Monitor sync frequency, data latency, and conflict logs.
  • Use SQL Profiler and Application Insights for diagnostics.

Conclusion
Data synchronization between on-prem and cloud SQL databases is critical for hybrid and enterprise applications. By combining SQL Change Tracking, secure APIs, and automated background jobs, you can build a reliable, scalable, and maintainable synchronization pipeline. While tools like Azure Data Sync simplify configuration, custom sync services offer unmatched flexibility — especially when integrating with ASP.NET Core APIs and custom business rules. The right approach depends on your data size, change frequency, and compliance requirements, but with careful design, real-time hybrid data synchronization is absolutely achievable.


 



SQL Server Hosting - HostForLIFE :: Using Linked Servers in SQL Server to Create a Centralized Reporting Database

clock November 11, 2025 06:17 by author Peter

In many firms, data is dispersed over several SQL Server instances; for instance, sales data may be on one server, HR data on another, and inventory data on a third.
Analyzing such dispersed data can become a headache if you have to manually export and combine it all in Excel or Power BI.
That’s where Linked Servers in SQL Server come in handy.

By using Linked Servers, you can build a Centralized Reporting Database — a single SQL Server instance that can query multiple databases across servers in real time, without the need for constant data imports or duplication.

This guide will teach you:

  • How to design a centralized reporting architecture
  • How to configure Linked Servers step-by-step
  • How to write cross-server queries
  • Optimization and security tips

What Is a Centralized Reporting Database?
A centralized reporting database is a single SQL Server database that:

  • Uses linked servers to connect to several different databases.
  • Combines their data into views or materialized tables.
  • Gives dashboards, analytics, and BI tools a single reporting layer.

This architecture simplifies report production, preserves real-time consistency, and helps prevent duplication.

How Do Linked Servers Work?

A Linked Server allows one SQL Server instance to execute commands against another database — even if it’s running on a different machine or platform.

Once configured, you can:
  • Run queries across servers using the four-part naming convention
  • Join remote and local tables seamlessly
  • Fetch or aggregate data on-demand

Example
SELECT *
FROM [CentralDB].[dbo].[Sales]
INNER JOIN [LinkedServer1].[ERP].[dbo].[Customers]
ON Sales.CustomerID = Customers.CustomerID;


Step-by-Step: Setting Up a Linked Server
Step 1: Create a Linked Server

EXEC sp_addlinkedserver
   @server = 'RemoteERP',
   @srvproduct = '',
   @provider = 'SQLNCLI',
   @datasrc = '192.168.1.20';  -- remote SQL Server instance

Step 2: Add Login Mapping
EXEC sp_addlinkedsrvlogin
   @rmtsrvname = 'RemoteERP',
   @useself = 'false',
   @locallogin = NULL,
   @rmtuser = 'sa',
   @rmtpassword = 'YourPassword';


Step 3: Test Connection

EXEC sp_testlinkedserver 'RemoteERP';

If the connection is successful, you’ll get a “Command(s) completed successfully” message.

Step 4: Query Remote Data
Now you can access tables on the remote server like this:
SELECT TOP 10 *
FROM [RemoteERP].[SalesDB].[dbo].[Orders];


Or even join with your local tables:
SELECT L.OrderID, L.TotalAmount, C.CustomerName
FROM [CentralDB].[dbo].[LocalOrders] AS L
INNER JOIN [RemoteERP].[SalesDB].[dbo].[Customers] AS C
ON L.CustomerID = C.CustomerID;


Step 5: Create a Centralized Reporting View
To make reporting easier, you can create views in your central database that aggregate remote data.

Example – A consolidated sales summary
CREATE VIEW vw_AllSales AS
SELECT
    S.OrderID,
    S.SaleDate,
    S.Amount,
    'ERP Server' AS Source
FROM [RemoteERP].[SalesDB].[dbo].[Sales] AS S
UNION ALL
SELECT
    S.OrderID,
    S.SaleDate,
    S.Amount,
    'CRM Server' AS Source
FROM [LinkedCRM].[CRMDB].[dbo].[Sales] AS S;


Now your reporting tools (like Power BI or SSRS) can query vw_AllSales directly — pulling unified sales data from multiple servers in real time.

Step 6: Automate Data Refresh or ETL (Optional)
If you want to materialize data locally for faster reporting (instead of real-time queries), you can use SQL Agent Jobs to schedule nightly imports:
INSERT INTO [CentralDB].[dbo].[SalesArchive]
SELECT * FROM [RemoteERP].[SalesDB].[dbo].[Sales]
WHERE SaleDate >= DATEADD(DAY, -1, GETDATE());


You can then use this staging table for reports, ensuring performance and reliability even when remote servers are busy.

Optimization Tips

  • Filter at the remote server: Use OPENQUERY() to run remote filtering before data transfer.
  • Index local staging tables: For large data sets, index the staging tables used for reporting.
  • Use incremental loads: Don’t pull entire tables; only sync new or updated data.
  • Use materialized views: If supported, create pre-computed summary tables for faster BI.

Example Using OPENQUERY (Better Performance)

SELECT *
FROM OPENQUERY(RemoteERP, 'SELECT CustomerID, SUM(Total) AS Sales FROM Sales GROUP BY CustomerID');

This executes the aggregation on the remote server before returning results — much faster for large datasets.
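The win comes from collapsing many sale rows into one row per customer before anything crosses the network. The effect of that group-by-sum, sketched in Python on made-up rows:

```python
from collections import defaultdict

def aggregate_sales(rows):
    """Group-by-sum equivalent of the OPENQUERY statement above:
    many raw sale rows collapse to one summary row per customer."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["customer_id"]] += r["total"]
    return [{"customer_id": c, "sales": s} for c, s in sorted(totals.items())]

rows = [
    {"customer_id": 1, "total": 100.0},
    {"customer_id": 2, "total": 40.0},
    {"customer_id": 1, "total": 60.0},
]
print(aggregate_sales(rows))
# [{'customer_id': 1, 'sales': 160.0}, {'customer_id': 2, 'sales': 40.0}]
```

Three input rows become two output rows here; on a real Sales table the reduction is typically orders of magnitude, which is exactly the data the linked server no longer has to ship.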

Security Best Practices
  • Use dedicated SQL logins for linked servers with read-only permissions.
  • Never store credentials in plain text; use SQL Authentication mapping.

Limit Data Access and RPC options unless needed:
EXEC sp_serveroption 'RemoteERP', 'rpc out', 'false';
EXEC sp_serveroption 'RemoteERP', 'data access', 'true';


Audit and monitor Linked Server connections via:
SELECT * FROM sys.servers;

Real-World Architecture Example
+-----------------------+
|   SQL Server Central  |  --> Linked Server Views (vw_AllSales, vw_AllCustomers)
|   Reporting Database  |
+-----------------------+
        |         |
        |         +----> RemoteERP (Sales, Billing)
        |
        +--------------> RemoteCRM (Leads, Customers)
        |
        +--------------> RemoteHR (Employees, Attendance)


Your BI tools (e.g., Power BI, Tableau, SSRS) connect only to the Central Reporting DB, which unifies all data sources dynamically.

Troubleshooting Common Linked Server Issues

  • “Cannot initialize data source object”: permissions or provider issue. Fix: grant file access or install the provider.
  • “Login failed for user”: wrong credentials. Fix: check the sp_addlinkedsrvlogin mapping.
  • “Query timeout expired”: slow network or a huge query. Fix: use filters or schedule ETL jobs.
  • “RPC Out not enabled”: cross-server procedure calls are blocked. Fix: run EXEC sp_serveroption 'Server', 'rpc out', 'true';

Conclusion

A Centralized Reporting Database powered by Linked Servers offers a powerful way to:

  • Unify data from multiple systems
  • Simplify reporting and analytics
  • Reduce manual data consolidation
  • Enable near real-time business insights

By combining Linked Servers, scheduled ETL processes, and reporting views, you can build a scalable, secure, and efficient data architecture for your organization — without requiring a full data warehouse immediately.


 



SQL Server Hosting - HostForLIFE :: Effectively Using XML and JSON Data in SQL Server for APIs

clock November 7, 2025 05:57 by author Peter

Backend databases are more than just tables and columns in today's API-driven applications. Complex hierarchical data, which is typically represented as JSON or XML, must be sent and received by APIs.

Because SQL Server natively supports both JSON and XML formats, developers can:

  • Parse data from API requests directly into tables.
  • Return structured responses without extensive application-layer changes.

We’ll look at how to integrate JSON and XML effectively inside SQL Server to improve efficiency, reduce code complexity, and streamline your API integration, particularly when using ASP.NET Core or Web API.

2. Why JSON and XML Matter for APIs
Most modern APIs use JSON (JavaScript Object Notation) for data exchange because it’s lightweight and human-readable.
However, many legacy systems and enterprise integrations still rely on XML for structured documents, invoices, and configurations.

SQL Server supports both formats, which means you can:

  • Store JSON/XML data directly in database columns.
  • Parse and query nested data.
  • Return dynamic API responses without looping in C#.

Here’s what it enables in real-world projects:
ASP.NET Core API receives JSON → stored procedure handles JSON directly.
Stored procedure returns JSON → API sends it back without re-serialization.

3. Working with JSON in SQL Server

3.1 Storing JSON Data
You can store JSON as plain text in an NVARCHAR(MAX) column:
CREATE TABLE CustomerOrders (
    OrderId INT PRIMARY KEY,
    CustomerName NVARCHAR(100),
    OrderData NVARCHAR(MAX) -- stores JSON
);


Example JSON data
{
  "OrderNumber": "ORD123",
  "Items": [
    { "Product": "Keyboard", "Qty": 2, "Price": 450 },
    { "Product": "Mouse", "Qty": 1, "Price": 250 }
  ],
  "Total": 1150
}


Insert JSON into the table
INSERT INTO CustomerOrders (OrderId, CustomerName, OrderData)
VALUES (1, 'Peter',
N'{
  "OrderNumber": "ORD123",
  "Items": [
    {"Product": "Keyboard", "Qty": 2, "Price": 450},
    {"Product": "Mouse", "Qty": 1, "Price": 250}
  ],
  "Total": 1150
}');


3.2 Reading JSON Values
Use JSON_VALUE() to extract a scalar value.
SELECT
  JSON_VALUE(OrderData, '$.OrderNumber') AS OrderNumber,
  JSON_VALUE(OrderData, '$.Total') AS TotalAmount
FROM CustomerOrders;


Output
OrderNumber | TotalAmount
------------|------------
ORD123      | 1150


3.3 Parsing Arrays with OPENJSON
Use OPENJSON() to split array elements into rows.
SELECT
  JSON_VALUE(OrderData, '$.OrderNumber') AS OrderNumber,
  Item.Product AS ProductName,
  Item.Qty AS Quantity,
  Item.Price AS UnitPrice
FROM CustomerOrders
CROSS APPLY OPENJSON(OrderData, '$.Items')
WITH (
    Product NVARCHAR(50) '$.Product',
    Qty INT '$.Qty',
    Price DECIMAL(10,2) '$.Price'
) AS Item;


This query expands nested arrays into tabular data — ideal for APIs that send product line items.
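Conceptually, OPENJSON with a WITH clause is a flatten operation: one output row per array element, each tagged with values from the parent document. The same shape, sketched in Python:

```python
import json

def flatten_order(order_json: str):
    """Python analogue of the OPENJSON query: one output row (tuple)
    per item in $.Items, each carrying the parent OrderNumber."""
    doc = json.loads(order_json)
    return [
        (doc["OrderNumber"], item["Product"], item["Qty"], item["Price"])
        for item in doc["Items"]
    ]

order = ('{"OrderNumber":"ORD123","Items":['
         '{"Product":"Keyboard","Qty":2,"Price":450},'
         '{"Product":"Mouse","Qty":1,"Price":250}],"Total":1150}')
print(flatten_order(order))
# [('ORD123', 'Keyboard', 2, 450), ('ORD123', 'Mouse', 1, 250)]
```

Doing this inside SQL Server rather than in application code is the whole point of the section: the rows arrive already tabular, ready to join or insert.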

3.4 Returning JSON from SQL Server

Instead of letting your ASP.NET Core app serialize the data, you can return it as JSON directly:
SELECT
  OrderId,
  CustomerName,
  JSON_QUERY(OrderData) AS OrderData  -- JSON_QUERY keeps the stored JSON as a nested object instead of an escaped string
FROM CustomerOrders
FOR JSON PATH, ROOT('Orders');


Output
{
  "Orders": [
    {
      "OrderId": 1,
      "CustomerName": "Peter",
      "OrderData": {
        "OrderNumber": "ORD123",
        "Items": [
          { "Product": "Keyboard", "Qty": 2, "Price": 450 },
          { "Product": "Mouse", "Qty": 1, "Price": 250 }
        ],
        "Total": 1150
      }
    }
  ]
}

This JSON can be sent directly to your API response — saving time and CPU cycles in .NET.

4. Working with XML in SQL Server
XML support in SQL Server has been around since 2005.
It’s still widely used in enterprise-level systems or integrations (e.g., financial, logistics, or government APIs).

4.1 Storing XML Data
CREATE TABLE VendorInvoices (
    InvoiceId INT PRIMARY KEY,
    VendorName NVARCHAR(100),
    InvoiceData XML
);


Example XML
<Invoice>
  <Number>INV-999</Number>
  <Date>2025-11-05</Date>
  <Items>
    <Item>
      <Name>SSD Drive</Name>
      <Qty>2</Qty>
      <Price>3200</Price>
    </Item>
    <Item>
      <Name>RAM 16GB</Name>
      <Qty>1</Qty>
      <Price>5500</Price>
    </Item>
  </Items>
</Invoice>

Insert XML
INSERT INTO VendorInvoices (InvoiceId, VendorName, InvoiceData)
VALUES (1, 'TechVendor Pvt Ltd',
N'<Invoice>
    <Number>INV-999</Number>
    <Date>2025-11-05</Date>
    <Items>
      <Item><Name>SSD Drive</Name><Qty>2</Qty><Price>3200</Price></Item>
      <Item><Name>RAM 16GB</Name><Qty>1</Qty><Price>5500</Price></Item>
    </Items>
</Invoice>');


4.2 Querying XML with XQuery
You can extract elements using .value() and .nodes().
SELECT
  InvoiceData.value('(/Invoice/Number)[1]', 'nvarchar(50)') AS InvoiceNumber,
  InvoiceData.value('(/Invoice/Date)[1]', 'date') AS InvoiceDate
FROM VendorInvoices;

Parsing nested XML arrays:
SELECT
  I.InvoiceId,
  Items.value('(Name)[1]', 'nvarchar(50)') AS ProductName,
  Items.value('(Qty)[1]', 'int') AS Quantity,
  Items.value('(Price)[1]', 'decimal(10,2)') AS Price
FROM VendorInvoices I
CROSS APPLY InvoiceData.nodes('/Invoice/Items/Item') AS T(Items);


4.3 Returning XML Responses
SELECT
  InvoiceId,
  VendorName,
  InvoiceData
FROM VendorInvoices
FOR XML PATH('Invoice'), ROOT('Invoices');


Output
<Invoices>
  <Invoice>
    <InvoiceId>1</InvoiceId>
    <VendorName>TechVendor Pvt Ltd</VendorName>
    <InvoiceData>
      <Invoice>
        <Number>INV-999</Number>
        ...
      </Invoice>
    </InvoiceData>
  </Invoice>
</Invoices>

5. JSON vs XML — Which to Choose?

Feature            | JSON                             | XML
-------------------|----------------------------------|---------------------------------
Format             | Lightweight, human-readable      | Verbose but structured
Best For           | Web APIs, REST, mobile apps      | Legacy systems, SOAP, enterprise
SQL Server Support | Since SQL Server 2016            | Since SQL Server 2005
Parsing Functions  | OPENJSON, JSON_VALUE, JSON_QUERY | .nodes(), .value(), .exist()
Schema Validation  | No                               | Yes (XSD support)
Performance        | Faster for APIs                  | Slightly slower due to verbosity

Recommendation

  • For modern APIs → Prefer JSON.

  • For legacy or B2B integrations → Use XML.

  • SQL Server can handle both — even in the same stored procedure.

7. Combining JSON + ASP.NET Core API
Here’s an example of a simple Web API endpoint calling a SQL stored procedure that accepts JSON.

Stored Procedure
CREATE PROCEDURE usp_SaveOrder
    @OrderJson NVARCHAR(MAX)
AS
BEGIN
    -- CustomerOrders.OrderId is a plain INT primary key (not IDENTITY),
    -- so the payload must supply it
    INSERT INTO CustomerOrders (OrderId, CustomerName, OrderData)
    SELECT JSON_VALUE(@OrderJson, '$.OrderId'),
           JSON_VALUE(@OrderJson, '$.CustomerName'),
           @OrderJson;
END


ASP.NET Core Controller
[HttpPost("api/orders")]
public async Task<IActionResult> SaveOrder([FromBody] JsonElement orderJson)
{
    string jsonData = orderJson.ToString();

    using var conn = new SqlConnection(_config.GetConnectionString("Default"));
    using var cmd = new SqlCommand("usp_SaveOrder", conn);
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@OrderJson", jsonData);

    await conn.OpenAsync();
    await cmd.ExecuteNonQueryAsync();
    return Ok("Order saved successfully");
}

Result
Frontend sends JSON → stored procedure parses it → no mapping issues or downtime.

8. Performance Tips for JSON/XML in SQL Server
Use NVARCHAR(MAX) for JSON columns — it performs better than TEXT or NTEXT.

Validate JSON using ISJSON() before processing
WHERE ISJSON(OrderData) = 1

Create computed columns from JSON for indexing
ALTER TABLE CustomerOrders
ADD OrderNumber AS JSON_VALUE(OrderData, '$.OrderNumber') PERSISTED;
CREATE INDEX IX_OrderNumber ON CustomerOrders(OrderNumber);

For XML, use typed XML with XSD schema for faster querying.

Avoid over-storing massive documents — keep only necessary data portions.

Return compact results using FOR JSON PATH, WITHOUT_ARRAY_WRAPPER when the response is a single object.

9. Real-World Use Case
A logistics company integrated multiple systems (fleet tracking, billing, and customer portal).
They used ASP.NET Core + SQL Server to unify data exchange.

Earlier, the application layer transformed large JSON payloads before inserting them into SQL tables, which wasted time. After moving the parsing logic into SQL stored procedures using OPENJSON:

  • API response time dropped by 30%.
  • CPU usage on the application servers fell.

10. Best Practices Summary

Area            | Best Practice
----------------|--------------------------------------------------
Storage         | Use NVARCHAR(MAX) for JSON, the XML type for XML
Validation      | Always validate JSON/XML before processing
Performance     | Use computed columns for indexing JSON fields
Security        | Sanitize input data before dynamic SQL usage
API Integration | Let SQL handle JSON serialization using FOR JSON
Maintainability | Keep JSON structures consistent with DTO models

11. Summary and Conclusion

SQL Server’s built-in JSON and XML features reduce integration complexity and let you build cleaner, faster APIs with less code.

By using:

  • OPENJSON() and FOR JSON for modern REST APIs
  • .nodes() and FOR XML for structured enterprise data

you can work with hierarchical data directly inside SQL Server, without extra transformations in C# or middleware.
This approach ensures:

  • Faster API responses
  • Reduced serialization overhead
  • Cleaner, maintainable backend logic

In short
If your APIs and database speak the same language — JSON or XML — your system becomes leaner, faster, and easier to evolve.




SQL Server Hosting - HostForLIFE :: Creating Stored Procedures with High Performance in SQL Server

clock November 3, 2025 07:33 by author Peter

When we talk about backend performance, stored procedures play a major role in SQL Server. They are precompiled, reusable, and secure, but if not written properly, they can become slow and inefficient. In this article, we’ll learn how to write high-performance stored procedures with simple explanations, tips, and an example flowchart.

What is a Stored Procedure?
A stored procedure is a group of SQL statements stored in the database. Instead of sending multiple queries from your application, you can just call one stored procedure — making the process faster and easier to maintain.

Example
CREATE PROCEDURE usp_GetCustomerOrders
    @CustomerID INT
AS
BEGIN
    SELECT OrderID, OrderDate, TotalAmount
    FROM Orders
    WHERE CustomerID = @CustomerID;
END


You can execute it like:
EXEC usp_GetCustomerOrders @CustomerID = 1001;

Why Does Performance Matter?
When you work with large data — millions of rows — even small inefficiencies can slow everything down.
Optimizing your stored procedures can:

  • Reduce CPU and memory load
  • Improve response time
  • Make the system more scalable
  • Lower the chance of deadlocks or timeouts

Best Practices for Writing High-Performance Stored Procedures

Let’s go step by step.

1. Use Proper SET Options

Start every stored procedure with:
SET NOCOUNT ON;

This stops the “(X rows affected)” message from returning, improving speed slightly and reducing unnecessary output.

2. Avoid SELECT *
Don’t select everything — it wastes memory and network bandwidth.

Bad
SELECT * FROM Orders;

Good
SELECT OrderID, CustomerID, OrderDate FROM Orders;

3. Use Proper Indexing
Indexes are like shortcuts in a phone contact list. Without them, SQL Server must scan every row.

Example
CREATE INDEX IX_Orders_CustomerID ON Orders(CustomerID);

Always check query performance using Execution Plans in SSMS.

4. Avoid Cursors (If Possible)
Cursors process rows one by one — very slow for large data.

Bad
DECLARE cur CURSOR FOR SELECT OrderID FROM Orders;

Good
Use set-based operations instead:
UPDATE Orders SET Status = 'Closed' WHERE OrderDate < '2024-01-01';

5. Use Temporary Tables or Table Variables Wisely
Temporary tables (#TempTable) can help when you have complex joins.

But don’t overuse them — they increase I/O on tempdb.

Use them only when needed:
SELECT * INTO #TempOrders FROM Orders WHERE Status = 'Pending';

6. Parameter Sniffing Control
Sometimes, SQL Server caches an execution plan based on the first parameter it sees, which may not work well for others.

To fix it
DECLARE @LocalCustomerID INT = @CustomerID;
SELECT * FROM Orders WHERE CustomerID = @LocalCustomerID;


This helps avoid performance issues due to parameter sniffing.

7. Use Transactions Carefully
Transactions ensure data consistency, but if they are too long, they can lock resources.

Use
BEGIN TRANSACTION;

-- Your statements
COMMIT TRANSACTION;


Don’t keep a transaction open for long-running logic or unnecessary steps.

8. Use TRY…CATCH for Error Handling
It’s important for clean rollback and error logging.

Example
BEGIN TRY
    BEGIN TRANSACTION;

    UPDATE Accounts SET Balance = Balance - 100 WHERE ID = 1;
    UPDATE Accounts SET Balance = Balance + 100 WHERE ID = 2;

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    ROLLBACK TRANSACTION;

    -- EXEC cannot take function calls as arguments; capture them into variables first
    DECLARE @ErrMsg NVARCHAR(4000) = ERROR_MESSAGE(),
            @ErrLine INT = ERROR_LINE();
    EXEC spLogException @ErrMsg, @ErrLine;
END CATCH;


9. Avoid Unnecessary Joins
Too many joins, especially on non-indexed columns, can slow performance drastically.

Keep joins only where needed, and always join on indexed keys.

10. Use Execution Plans and Statistics
Before finalizing your procedure:
SET STATISTICS IO ON;
SET STATISTICS TIME ON;


Then execute your procedure and check the output to understand how much time and I/O it consumes.
Also, view the execution plan (Ctrl + M in SSMS) to find table scans, missing indexes, or inefficient joins.

This flow shows how SQL Server optimizes execution by reusing cached plans whenever possible.

Example: Optimized Stored Procedure
Here’s a real-world optimized procedure example:
CREATE PROCEDURE usp_GetActiveOrders
    @StartDate DATE,
    @EndDate DATE,
    @CustomerID INT = NULL
AS
BEGIN
    SET NOCOUNT ON;

    BEGIN TRY
        SELECT O.OrderID,
               O.OrderDate,
               O.TotalAmount,
               C.CustomerName
        FROM Orders O
        INNER JOIN Customers C ON O.CustomerID = C.ID
        WHERE O.OrderDate BETWEEN @StartDate AND @EndDate
          AND (@CustomerID IS NULL OR O.CustomerID = @CustomerID)
          AND O.Status = 'Active'
        ORDER BY O.OrderDate DESC;
    END TRY
    BEGIN CATCH
        -- EXEC cannot take function calls as arguments; capture them into variables first
        DECLARE @ErrMsg NVARCHAR(4000) = ERROR_MESSAGE(),
                @ErrLine INT = ERROR_LINE();
        EXEC spLogException @ErrMsg, @ErrLine;
    END CATCH;
END


Key Points

  • Used SET NOCOUNT ON
  • Controlled optional parameter
  • Filtered with indexes
  • Handled errors safely



