What is the CPU usage of my Azure SQL Server?

The other day, I was stuck on a simple question. I could see the DTU metric in the Azure dashboard, but I needed a quick way to see the current CPU utilization.

I went through a lot of sites, but the answer was staring me in the face!

Just click the “DTU” section, and it will take you to a new page. Now, you can choose what metrics you need to be displayed. One of these is CPU utilization.

Just click that and you can see the current CPU usage as a percentage.

Such a simple solution…

Posted in Uncategorized | Leave a comment

Using PowerShell to delete lines from a text file based on certain keywords

Continuing from the previous post: there was an issue with a certain text file. The output file contained a lot of redundant data, with lines like these :

“Error 11111 : The check finished without errors”

“Error 2222 : Login succeeded for user AMAM”

We needed to eliminate the lines containing these phrases from the output file. So how do you do that?

You can add this line to your file :

Get-Content $OutputFile | Where-Object { $_ -notmatch 'finished without errors' -and $_ -notmatch 'Login succeeded for user' } | Set-Content Output_2.txt

So as you can see, the idea is to test each line (the $_ pipeline variable) with the -notmatch operator, a regular-expression comparison, followed by your string. To combine multiple phrases, use the -and operator.

The only issue I faced was that I had to output to a separate text file called Output_2.txt. The command failed when I tried to write back to the same file I was reading from.
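I have since seen a workaround suggested for overwriting the same file; I have not tested it myself, so treat it as a sketch. The idea is to wrap Get-Content in parentheses so the whole file is read into memory before the pipeline starts writing :

```powershell
# Parentheses force the entire file to be read (and closed) first,
# so Set-Content is no longer trying to write to a file that is still open for reading.
(Get-Content $OutputFile) |
    Where-Object { $_ -notmatch 'finished without errors' -and $_ -notmatch 'Login succeeded for user' } |
    Set-Content $OutputFile
```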

Thanks to this site :

http://www.computerperformance.co.uk/powershell/powershell_match.htm#Example_4: More_Comparitors_-NotMatch_also_-And_

Posted in Uncategorized | Leave a comment

Using PowerShell to run scripts against multiple servers

Recently I started digging into PowerShell. I am amazed at how elegant and powerful it is, not to mention its speed !

To start, I browsed to Start > All Programs > Accessories > Windows PowerShell > Windows PowerShell, where we get the PowerShell (PS) command prompt.

You can navigate to the folder of your choice using the common “cd ” format of DOS. For example, to get to G:\Powershell_test_folder, just type : “cd G:\Powershell_test_folder”. I have saved some files in this location, hence the reason to navigate here.

Now I have saved the names of all my target SQL Servers in a text file at that location called AllServers.txt. You can simply add the names one by one. I used the “IP,port_number” format like this :
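For example, AllServers.txt might look like this (the addresses below are placeholders, one server per line) :

```text
10.20.30.40,1433
10.20.30.41,1433
10.20.30.42,14330
```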




Now we can add the query that we want to run against these n servers in a Query.sql file at that location. Let's say we want the database names on each server. So save this query in the Query.sql file :

select name from sys.databases 

Now we will run a Powershell script to run this query against all the servers.

But before we do that, we will need to add these two lines in the Powershell command prompt :

PS > Add-PSSnapin SqlServerCmdletSnapin100

PS > Add-PSSnapin SqlServerProviderSnapin100
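(These snap-in names are specific to SQL Server 2008/2008 R2. On SQL Server 2012 and later the same cmdlets ship as a module instead, so the equivalent step would be something like :

```powershell
# SQL Server 2012+ provides the cmdlets via the SQLPS module rather than snap-ins;
# -DisableNameChecking suppresses the warning about non-standard verb names.
Import-Module SQLPS -DisableNameChecking
```

Check which one your installation provides before running the script.)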

Here is the PS script (let's call it PScript1.ps1) :


$QueryPath = "G:\Powershell_test_folder\Query.sql"
$OutputFile = "G:\Powershell_test_folder\Output.txt"

$ExecuteQuery = Get-Content -Path $QueryPath | Out-String

"Results --> `r`n`r`n" > $OutputFile

FOREACH ($server in GC "G:\Powershell_test_folder\AllServers.txt")
{
    "-------------------------------" >> $OutputFile
    $server >> $OutputFile
    "-------------------------------" >> $OutputFile
    Invoke-Sqlcmd -ServerInstance $server -Query $ExecuteQuery -QueryTimeout 60000 | ft -AutoSize | Out-String -Width 4096 >> $OutputFile
}


OK, so how does the script work? We define three variables using the $ symbol :

  1. $QueryPath : This shows the location of the query file.
  2. $OutputFile : This is where the results of your query appear.
  3. $ExecuteQuery : This will read (get-content or gc) your input query and write the “Results” string to your output file.

Now the FOREACH command will iterate through your servers in the AllServers.txt file. It does this by reading (GC or Get-Content) the AllServers.txt file. It then outputs the name of your server first, followed by the results of your query (in this case the list of databases on your server).

The query is executed using the Invoke-Sqlcmd cmdlet. This is just like running the usual sqlcmd command. The "| ft" (an alias for Format-Table) means it will output the result as a formatted table.

In my case, I used a more complex query to collect details from the errorlogs. The following query, used in the Query.sql file, runs the xp_readerrorlog command to find wherever 'error' or 'fail' entries were recorded in the SQL Server errorlogs over the past three days :

declare @Start varchar(30);
declare @End varchar(30);
set @Start=convert(varchar(30),getdate()-3,25);
set @End=convert(varchar(30),getdate()+1,25);
exec master..xp_readerrorlog 0, 1, N'error', NULL, @Start, @End
exec master..xp_readerrorlog 0, 1, N'fail', NULL, @Start, @End

To execute the PS script, simply type this at your PS command prompt :

PS > .\PScript1.ps1

PowerShell is fast and awesome! There is still a lot to learn and study about this. Here are some resources that helped me (thanks to Pinal Dave, Raushan, Sandip Shinde, and others) :



SQL Server – PowerShell for Executing a SQL Query on Multiple Servers





Posted in Uncategorized | Leave a comment

Importing data from Oracle database to SQL Server using SSIS

Let’s say we have a couple of tables in Oracle that need to be imported to SQL Server. Note that only certain tables need to be imported, and not the entire schema (packages, functions, etc).

One of the easiest ways to do this is to import using SSIS. Here, I will list the steps taken to import from Oracle 11g to SQL Server 2012 database using SSIS installed with SQL Server 2008 R2.

Launch BIDS and create a new Integration Services Project. Create two new Connections in the Connection Manager. One of them should connect to Oracle. I used “Microsoft OLE DB Provider for Oracle”. Your Oracle connection should look like this :


As you can see above, you need to provide the correct Server name (Oracle database instance) and your username/password to connect to the Oracle schema.

Next, you need to create a SQL Server connection. I used the “SQL Server Native Client 10.0” provider for this. Here is a screenshot :


So again, provide the instance detail (including the port number) in ServerName, and choose either Windows or SQL Server authentication to connect to the server. Once authenticated, the databases drop-down will be populated, and you can choose your SQL Server database to import the Oracle tables to.

Now let’s import the data. Drag a new Data Flow Task into the Control Flow pane. Double click to get to the Data Flow pane.



Your “OLE DB Source” is the Oracle database, so double click it and assign the Oracle connection that you created earlier. Once it connects to the Oracle db, it will load all the tables and views in the table drop-down. Choose the table that you want to import. This may take a while if there are a lot of Oracle schemas on the source database. Now it should look like this:


Now double click the “OLEDB Destination”. This is your SQL Server target database. Provide the SQL Server connection that you gave earlier. You can choose to create a new table to import the data to :


In general, you don't need to change any default values.

You need two more steps to complete the process:

  1. Assign AlwaysUseDefaultCodePage=TRUE in the Source properties :


2. Set “Run64BitRuntime” to False in the main Project Properties pane. Apparently, this is because of an issue with the 32-bit vs the 64-bit Oracle drivers. You can check more here : https://blogs.msdn.microsoft.com/amitlale/2007/12/11/ssis-64-bit-setting/


That’s it. Now run the package and it should seamlessly import the Oracle tables to your SQL Server database.

Posted in Uncategorized | Leave a comment

Using “WITH VALUES” in a DEFAULT clause

The WITH VALUES clause is simple to understand.

Suppose we have a table called Orders  :
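(The original table was shown as a screenshot. A minimal guess at what Orders might look like, with the column names taken from the insert statements used later in this post :

```sql
-- Assumed minimal definition of the Orders table for this demo,
-- pre-populated with the single existing row (orderid 100).
create table Orders (
    orderid int,
    orderdate datetime,
    custid int);

insert Orders (orderid, orderdate, custid) values (100, getdate(), 1);
```
)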


Now we add a column with a default value :

  • alter table orders add newcol varchar(100) default 10
  • insert orders (orderid,orderdate,custid) values (101,getdate(),1)

Now query the table again :


You can see that the new column gets the default value only for newly inserted rows. So the original row, i.e. the one with orderid 100, does not have the new default value applied; it is NULL.

Now drop the column and add it again WITH VALUES :

  • alter table Orders drop column newcol;
  • alter table orders add newcol varchar(100) default 10 with values

Now query the table :


You can see that specifying WITH VALUES has updated all rows in the table, including the first one with orderid 100.

Hope this clarifies the use of WITH VALUES.


Posted in Uncategorized | Leave a comment

Primary Database xxx has active log shipping secondary database(s). Drop the secondary database(s) first.

Ran into a bit of a glitch with Log Shipping today.

As you know, there are three simple documented steps to remove log shipping from your server:

  1. Remove log shipping on primary using :

    EXEC master.dbo.sp_delete_log_shipping_primary_secondary

  2. Remove log shipping on secondary using : sp_delete_log_shipping_secondary_database
  3. Remove log shipping on primary using : sp_delete_log_shipping_primary_database


Now, there was a problem here. The second step should ideally drop the secondary database, but it didn't. So when it came to the 3rd step, it threw up this error :

Primary Database xxx has active log shipping secondary database(s). Drop the secondary database(s) first.

So the secondary database was dropped and this command was executed again. It still failed. Something was not right.

The issue is that the information about this database still exists in a system table. You need to query that and remove the record:

select * from msdb.dbo.log_shipping_primary_secondaries

Once you delete the record, you can safely run the 3rd step and it will succeed.
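For reference, the cleanup could look something like this. This is a sketch; the filter depends on your setup, and 'xxx' below is a placeholder for your actual secondary database name :

```sql
-- Find the orphaned log shipping record first...
select * from msdb.dbo.log_shipping_primary_secondaries;

-- ...then delete it (replace the placeholder with your secondary database name)
delete from msdb.dbo.log_shipping_primary_secondaries
where secondary_database = N'xxx';
```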



Posted in Uncategorized | Leave a comment

SQL Server Export/Import Wizard and the missing Identity property (Or : How I learnt not to trust the Wizard)

SQL Server comes equipped with a fantastic wizard for importing/exporting data. It’s so simple : just right click on your database > Tasks > Export Data (or Import Data). What could be simpler, right ?

No, sir, this is one deceptive tool…

Suppose I create a table like this : just an integer column with an identity value, and a varchar column :

create table Birdman (
bid int identity(1,1),
bname varchar(100))

Let's say I want to export this table to another database. I fire up the Export Wizard and run through the steps. In the “Column Mappings” section, I dutifully click the “Enable identity insert” checkbox as I know this will involve an identity column :

Identity checkbox

And it succeeds.

Now script out the newly created table in the other (target) database, and see what you get :

CREATE TABLE [dbo].[Birdman](
[bid] [int] NOT NULL,
[bname] [varchar](100) NULL
)

Dude, where’s my identity property ?????

Now as it turns out, the SQL Server Import/Export Wizard has a nasty way of ignoring the Identity setting when it copies a table across. It also conveniently ignores Constraints, Indexes, etc. In short, it is thoroughly unreliable to transfer database objects with this method.

A safer approach would be to script out the object and create it on the target. Once the table is created, use the Import/Export Wizard to copy data across databases.

Hope this helps someone out there !

Posted in Uncategorized | Leave a comment