Galin Iliev's blog

Software Architecture & Development

SQL 2005 Database mirroring

I had an interesting case last week - I had to set up SQL 2005 database mirroring. At first it sounded like a piece of cake - I just had to follow the steps and recommendations described in SQL Server Books Online (and here). My goal was to set up a database mirror of type High safety with automatic failover (synchronous).

I had to prepare the SQL instances as follows:

  1. Create a user (called SqlService in my case) with the same password on all machines that will participate in the mirror, and give it sufficient rights. (I put it in the Administrators group and denied local logon; later I applied more granular security to it.)
  2. Set up the instances to run under this user account.
  3. Set trace flag 1400 as a startup parameter on each SQL instance.

 

Then I followed How to: Prepare a Mirror Database for Mirroring (Transact-SQL), and when it came to setting up the DB mirror it was created... But although I had a witness server, I was unable to get automatic failover to work. More precisely, I pointed a simple web application at a connection string of this form:

"Data Source=myServerAddress;Failover Partner=myMirrorServer;Initial Catalog=myDataBase;Integrated Security=True;"

but when I stopped the instance holding the principal database, the mirror database did not become the principal as expected.
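
For reference, the Books Online procedure roughly boils down to T-SQL along these lines. This is only a sketch under assumptions: the endpoint name, port 5022, MyDatabase and the server names PrincipalSrv, MirrorSrv and WitnessSrv are placeholders, not the actual machines from my setup.

-- On every instance that takes part in the mirror: create a mirroring endpoint
CREATE ENDPOINT Mirroring
    STATE = STARTED
    AS TCP (LISTENER_PORT = 5022)
    FOR DATABASE_MIRRORING (ROLE = ALL);

-- On the mirror instance: restore the database (and at least one log backup)
-- WITH NORECOVERY, then point it at the principal
RESTORE DATABASE MyDatabase FROM DISK = 'C:\Backup\MyDatabase.bak' WITH NORECOVERY;
ALTER DATABASE MyDatabase SET PARTNER = 'TCP://PrincipalSrv:5022';

-- On the principal instance: complete the pairing, make sure safety is FULL
-- (high safety, synchronous), then add the witness for automatic failover
ALTER DATABASE MyDatabase SET PARTNER = 'TCP://MirrorSrv:5022';
ALTER DATABASE MyDatabase SET PARTNER SAFETY FULL;
ALTER DATABASE MyDatabase SET WITNESS = 'TCP://WitnessSrv:5022';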

Note: This test was done on SQL 2005 Standard Edition with SP2 on Windows 2003 Standard with the latest updates.
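
A quick way to see which server currently holds which role during such a failover test is to query the mirroring catalog view on either partner. This is standard SQL 2005 metadata; the sketch just filters to mirrored databases:

SELECT DB_NAME(database_id) AS database_name,
       mirroring_role_desc,          -- PRINCIPAL or MIRROR
       mirroring_state_desc,         -- e.g. SYNCHRONIZED
       mirroring_safety_level_desc   -- FULL for high safety
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL;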

I decided to test this on SQL 2005 Standard (without SP2) on Win2003, and it worked. I then applied SP2, was able to set up DB mirroring again, and everything worked as expected - even automatic failover.

Bottom line: somehow SQL 2005 SP2 breaks DB mirroring if it is installed right after the SQL Server installation. This was my experience, and if someone has had a different experience with this I would be grateful if we could discuss it here.


Hope this helps!

Empty Database for Production

It is common to test data-driven applications with dummy data to check that everything works fine. Once testing is done, the easiest way to go live is to recreate the database. In case you have real lookup data in some of the tables (such as predefined users, user groups, security rights, etc.) you have to script them too.

There is another way though: just clear the dummy data. As we know, this can be done with the T-SQL DELETE statement. Unfortunately, DELETE does not reset autonumber (identity) fields, so if you have identity columns in your tables you have two choices:

  • run the TRUNCATE TABLE statement - you can use this only if there are no constraints or foreign keys referencing the table (see the sketch after this list for tables linked by a foreign key)
    TRUNCATE TABLE  tblName
    
  • or delete all data and reset the identity field (back to 1) with the following statements:
    DELETE FROM tblName
    DBCC CHECKIDENT (tblName, RESEED, 0)
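
For tables that are linked by a foreign key (where TRUNCATE TABLE is not allowed), the same effect can be achieved by deleting children before parents and reseeding each table. This is just a sketch; tblOrderItems and tblOrders are hypothetical names:

-- delete child rows first so the foreign key is not violated
DELETE FROM tblOrderItems
DELETE FROM tblOrders

-- reset the identity values so newly inserted rows start from 1 again
DBCC CHECKIDENT (tblOrderItems, RESEED, 0)
DBCC CHECKIDENT (tblOrders, RESEED, 0)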

Hope this helps

How to insert record to DB using Javascript and Astoria Service

I have written several posts regarding Astoria services, and in this one I will demonstrate how to insert a record into the database using AJAX calls.

Read my previous post - How to consume Astoria Service - in order to get deeper into the solution environment.

I extended the JavaScript code with one more function that adds a new record to the database. As it is for demonstration purposes, please excuse the hard-coded values.

So in order to add a new record you need to:

  • create and initialize a variable in the appropriate JSON format. Be careful here! All fields required by the DB must be present and have valid values.
  • set the HTTP POST verb on the request.
    wRequest.set_httpVerb("POST");
  • set the URL so that it points to the root of the entity set (Customers in our case)
  • set the request's "Accept" AND "Content-Type" headers to "application/json"
    wRequest.get_headers()["Accept"] = "application/json";
    wRequest.get_headers()["Content-Type"] = "application/json";
  • set the request body to the serialized content of the variable.
    wRequest.set_body(Sys.Serialization.JavaScriptSerializer.serialize(customer));

and finally call the invoke() method.

Here is the full function:

function createNewCustomer(custID, contactName, companyName)
{
    //create new item
    var customer = {__metadata: {Type:"Customer" },
        CustomerID: custID, ContactName: contactName, CompanyName: companyName, ContactTitle: "", Address: "", City: "",
        Region: "", PostalCode: "", Country: "", Phone: "", Fax: "" };

    //save item to server using Astoria Service
    var wRequest = new Sys.Net.WebRequest();
    wRequest.set_httpVerb("POST");
    wRequest.set_url("http://galcho-pc:86/northwind.svc/Customers");

    wRequest.get_headers()["Accept"] = "application/json";
    wRequest.get_headers()["Content-Type"] = "application/json";
    wRequest.add_completed(function (response, eventArgs){
        var statusText = response.get_statusText();
        alert(statusText);
    });

    wRequest.set_body(Sys.Serialization.JavaScriptSerializer.serialize(customer));
    wRequest.invoke();
}

And the button that executes it is defined with the following HTML:

<input type="button" value="New Customers" onclick="createNewCustomer('AAAA','Galin Iliev', 'Galcho.COM');" />

Hope you find it interesting.

UPDATE: Here is a sample project:

AstoriaTests.zip (191.09 KB)

In order to run the sample project, do the following:

  • make sure you have installed the packages stated here.
  • extract the files and modify the NorthwindEntities connection string in web.config to match your SQL Server instance that runs the Northwind DB
  • when running the sample, make sure the URL in the browser's address bar contains the same hostname as the astoriaUrl JS variable (Default.aspx, line 29); otherwise you will get an "Access Denied!" error from the browser's single-origin restriction.

How to consume Astoria Service

A week ago I wrote about setting up an Astoria service. Now it is time to expose the real power of Astoria - consuming the data from client script.

Once you have set up your web data service, you're ready to write ASP.NET AJAX script to consume it. Basically you need to do these things:

  • set the request's "accept" header to "application/json" so the data can be used directly in JS
  • call the appropriate URL
  • and, of course, process the data

The rest is done by the Astoria team :) Kudos to them.

This is how the code looks:

function loadCustomers(){
    var webRequest = new Sys.Net.WebRequest();

    //request the Customers entity set from the Astoria service
    webRequest.set_url("http://galcho-pc:86/northwind.svc/Customers");

    //ask for JSON so the response can be used directly from script
    webRequest.get_headers()["accept"] = "application/json";

    webRequest.add_completed(function (result){
        var cust = result.get_object();
        for (var i = 0; i < cust.length; i++){
            addCustomer(cust[i].ContactName, cust[i].CompanyName);
        }
    });

    webRequest.invoke();
}

function addCustomer(contactName, companyName){
    //append the customer to the "customers" DIV
    $get("customers").innerHTML += "<div>" + contactName + "</div>" + companyName + "<br/>";
}

The main function is loadCustomers(). It does everything I described above. And there is a helper function that fills a DIV's innerHTML property with the appropriate content.

Here is the HTML:

<div id="customers">
    <span style="font-size: large; font-weight: bold; border: solid 1px black; width: 100%;
        display: block;">
        <div>
            Contact Name</div>
        CompanyName</span>
</div>

And the result looks like:

 

You can download the full ASPX file from here:

Astoria-Simple-Test.zip (0.94 KB)

A more sophisticated article on an Astoria consumer app is coming. It is based on the MIX AJAX classes that provide infrastructure for CRUD operations.

Getting started with Astoria (screenshots story)

Today I will lead you through the process of creating an Astoria service. It is pretty easy, as you'll see. In my previous post I described the packages required to run Astoria.

There are mainly two stages:

  • Create and set up an Entity Framework data model
  • Create the Astoria service

Once you have them, you can start Visual Studio 2008 Beta 2.

Then select Project -> Add New Item… from the menu and choose ADO.NET Entity Data Model.

Then we follow the EF wizard to generate a model from the database.

We select the connection to the MS SQL Server instance that has the AdventureWorks database.

On the next screen we select the objects that will be added to the EF schema and click the Finish button:

And we have the schema ready to use:

Now we are ready to setup Astoria Service:

Select Project -> Add New Item… from the menu.

When we have the class generated, all we have to do is set the context object for the EF model. We enter AdventureWorksModel.AdventureWorksEntities and compile.

And this is all! We are ready to use it. It was pretty easy,  wasn’t it?

Now let's test it. All we have to do is load AdventureWorks.svc in the browser:

On this screen we see all the entities in the schema. We can navigate further just by changing the URL:

You can play with some other Astoria-specific queries like these (change the host:port to match yours):

There is one very good tool that you can use: http://astoria.sandbox.live.com/tools/raw.htm. In order to use the page you must host it in the same application as the Astoria service, otherwise the browser's single-origin check will block the requests. So save the page and add it to your project.


In the next post I will show you how to consume the data from JavaScript.

Astoria CTP Refresh for EF Beta 2

Pablo Castro announced that there is a CTP refresh of the Astoria bits. Astoria is a very cool project based on .NET 3.0 (WCF) and the Entity Framework that provides pure HTTP access to SQL Server data. This means that we will be able to pull data using... JavaScript.

In order to use it you need:

I am preparing a (short) article that shows how to leverage the data from JavaScript and ASP.NET AJAX... Stay tuned and I will announce it here.

Look under the hood of SQL Server Storage engine

We software engineers have an interesting profession, don't we? The results of our work should make our customers' lives easier. But there is something else that keeps us in front of the screen, and that is making things work our way. Especially when we are able to find out how the internals work, we feel like we're in heaven :).

Besides being cool, it is very useful to know what's under the hood. Knowing this, we can create more optimized products and make things fly...

There is a tool called SQL Internals Viewer that lets you look into the SQL Server storage engine and see how data is physically allocated, organized and stored. There is no need to say how much we can benefit from this knowledge.

But hey... that's not all. There is also a Resources section in which you can dive and read about the internals...

XML in SQL 2005 and SqlCommand

Although XML has good support in SQL 2005, I hadn't used it heavily until today. I had to optimize an application that uses a table with two XML columns. These columns were of type text, and filter operations against them were costly.

Let's say a query like this one takes about 2 seconds to execute (on a table with only 2000 rows):

SELECT *
FROM tableName
WHERE xmlColumn LIKE '%<AccountID>23</AccountID>%'

Although this is not the best way to achieve it, it worked... slowly :) So it was worth investing some time to make the changes.

I knew the columns should be converted to type xml, but I didn't know whether this would speed up the process, or how to manipulate XML from T-SQL.
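
The conversion itself can be scripted; here is only a sketch, using the same placeholder names as the rest of this post (tableName, xmlColumn) plus a temporary column name xmlColumnNew introduced just for the example. Note that text cannot be converted to xml directly, so it goes through nvarchar(max), and the stored values must be well-formed XML:

-- add a new column of type xml next to the old text column
ALTER TABLE tableName ADD xmlColumnNew xml NULL

-- copy the data; text is converted to nvarchar(max) before it can become xml
UPDATE tableName
SET xmlColumnNew = CONVERT(xml, CONVERT(nvarchar(max), xmlColumn))

-- drop the old text column and give the new one the original name
ALTER TABLE tableName DROP COLUMN xmlColumn
EXEC sp_rename 'tableName.xmlColumnNew', 'xmlColumn', 'COLUMN'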

In order to better explain how to work with XML, we will use the following document:

<root>
    <node id="1" value="test1" />
    <node id="2" value="test2" />
    <AccountID>23</AccountID>
</root>

Step 1 - Manipulating XML from T-SQL

There are some good pages in SQL 2005 Books Online called xml Data Type Methods. They explain well how to work with xml. So, briefly:

In order to get a value as the result of a query, you'll need a query like this one:

SELECT xmlColumn.query('data(/root/node/@id)[1]')
FROM tableName

To get a value from the XML we need two things: an XQuery expression that specifies which nodes to return (@id specifies the attribute), and, in square brackets, which value to return when there is more than one node with the same name (as in our case).

For filtering by a value within the XML, this query helps:

SELECT *
FROM tableName
WHERE xmlColumn.value('data(/root/AccountID)[1]', 'int') = 23

Note: By changing the type from text to xml and modifying the SQL statements, I was able to execute the same query in 0.0001 seconds instead of 2 seconds. Seems like good progress...

Step 2 - Using it from a .NET Application

So far so good. Let's use it in a .NET application. This one seems easy:

string strXml = "";
SqlConnection cn = new SqlConnection("connection string here");
SqlCommand cmd = new SqlCommand(@"SELECT *
FROM tableName
WHERE xmlColumn.value('data(/root/AccountID)[1]', 'int') = 2", cn);
cn.Open();
try
{
    SqlDataReader rd = cmd.ExecuteReader();
    if (rd.Read())
    {
        strXml = rd[1].ToString();
    }
    rd.Close();
}
catch (SqlException ex)
{
    MessageBox.Show(ex.Message);
}
finally
{
    cn.Close();
}
this.txtContent.Text = strXml;

I expected this to be all, but no luck... I got a funny exception:

SELECT failed because the following SET options have incorrect settings: 'ARITHABORT'. Verify that SET options are correct for use with indexed views and/or indexes on computed columns and/or query notifications and/or xml data type methods.

So including "SET ARITHABORT ON;" above every SQL statement used in the code solved the issue.
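
In practice that just means prefixing the batch sent from the application; the query from the snippet above then becomes:

SET ARITHABORT ON;
SELECT *
FROM tableName
WHERE xmlColumn.value('data(/root/AccountID)[1]', 'int') = 2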

The overall speed-up in the application was 19 seconds: an operation that initially took 20 seconds now executes in about 1 second. And this was just from converting the columns (and queries) to use XML on the SQL side... Seems like good progress.

Hope this helps

Entity Framework for DBAs

There is a lot of noise these days around Visual Studio 2008 and the fancy LINQ to SQL and Entity Framework, and discussions about how they will change the way data-driven applications are built.

But something has been missed until now - how will this affect database administrators (DBAs)? Erick Thompson posted a good explanation of these technologies that targets DBAs and explains what will change in their daily job.

Go and read the ADO.NET blog post - it's worth it.