Restoring into an Azure Managed Instance from SQL 2014

We are in the process of setting up our Azure environment, and Managed Instances let us move efficiently from an on-premises legacy world to a similar setup inside Azure. I’ve just completed my first migration from SQL Server 2014 on-premises onto a SQL Managed Instance running SQL Azure Database V12. There are several functional steps that need to be followed in sequence to complete the migration of data, which I’ll try to cover off below.

Creating an Azure Storage Container

Similar to S3, Azure has storage accounts which can store multiple blob containers (folders). The general hierarchy is:

Subscription
  --Resource Group
    --Storage Account
      --Storage Container

The Storage Account can be restricted to only allow certain IP ranges (an IP whitelist) to access the data, and as a best practice this should be set.

The blob container should be set to private to prevent anonymous access – it depends on what is being stored, but for database backups I’d say private should be the default position.

Backup a SQL 2014 DB to Azure Storage Blob

To be able to talk to the blob storage, we need to provide a credential for SQL Server to use, which has a login access key to the storage account.

USE master 
CREATE CREDENTIAL [backups] -- name of the credential 
WITH IDENTITY='my_storage_account' -- storage account name. 
, SECRET = 'one_of_the_two_storage_keys_for_the_account' -- this is the access key token 
GO

NOTE: If you try to use a Shared Access Signature (SAS) on SQL Server 2014, it will fail to decode the signature when trying to use it. Follow this document to create a SQL 2014 Credential

Error if using SAS on SQL 2014:

Msg 3298, Level 16, State 2, Line 11
Backup/Restore to URL device error: Error while decoding the storage key.
Msg 3013, Level 16, State 1, Line 11
BACKUP DATABASE is terminating abnormally.

To see the credentials that are in place on SQL Server:

select * from sys.credentials

At this point, we can start referencing the credential to back up to a URL-based location:

BACKUP DATABASE mydatabase 
TO URL = 'https://<storage_account>.blob.core.windows.net/<storage_container>/mydatabase.bak'
WITH CREDENTIAL = 'backups'
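For larger databases, compression and progress reporting can be added to the same command – a sketch assuming the same placeholder storage account and credential names as above (COMPRESSION and STATS are supported with BACKUP TO URL on SQL Server 2014):

```sql
-- Backup to URL with native compression and a progress message every 10%
BACKUP DATABASE mydatabase
TO URL = 'https://<storage_account>.blob.core.windows.net/<storage_container>/mydatabase.bak'
WITH CREDENTIAL = 'backups',
     COMPRESSION,
     STATS = 10
```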

Restore a SQL 2014 Backup to SQL Azure Managed Instance

To be able to allow a SQL Azure database to talk to blob storage, we need to provide a credential using a Shared Access Signature (SAS).

To create a SAS, you need to generate a signature from within the Storage Account in the azure portal. Set an expiry on the SAS to give some added security.

NOTE: Remember to remove the initial ? from the generated SAS token, otherwise the key will be invalid.

USE master 
CREATE CREDENTIAL [https://<storage_account>.blob.core.windows.net/<storage_container>] -- this name must match the container path, start with https and must not contain a trailing forward slash. 
WITH IDENTITY='SHARED ACCESS SIGNATURE' -- this is a mandatory string and do not change it. 
, SECRET = 'sas_key' -- this is the shared access signature token 
GO

Run the restore:

RESTORE DATABASE mydatabase 
FROM URL = 'https://<storage_account>.blob.core.windows.net/<storage_container>/mydatabase.bak'

NOTE: Notice that there are no WITH clauses in the restore, as these are not supported in SQL Azure. Normally you would have a STATS clause to monitor the restore, but this can be done either via the percent_complete column of sys.dm_exec_requests or by downloading sp_WhoIsActive – check out Brent Ozar’s post on this.
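As a sketch, the sys.dm_exec_requests approach looks like this – run it from a second session while the restore is in flight:

```sql
-- Monitor backup/restore progress from another session
SELECT r.session_id,
       r.command,
       r.percent_complete,
       r.estimated_completion_time / 1000 / 60 AS estimated_minutes_remaining
FROM sys.dm_exec_requests AS r
WHERE r.command IN ('RESTORE DATABASE', 'BACKUP DATABASE');
```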

If you’re happy with the restored database, then it can be changed to 2017 compatibility if the application for the database supports it.

USE [master]
GO
ALTER DATABASE [mydatabase] SET COMPATIBILITY_LEVEL = 140
GO


Check compressed database backup ratio

If running compressed backups natively in SQL Server, you can check the compression ratio using the following script:

SELECT backup_size/compressed_backup_size 'Ratio', backup_size/1024/1024/1024 'Original Size GB', compressed_backup_size/1024/1024/1024 'Compressed Size GB', 
backup_start_date, backup_finish_date FROM msdb..backupset
where database_name = 'Database_Name'
and backup_size/compressed_backup_size > 1
order by backup_start_date desc;

Cannot execute as the database principal because the principal "##MS_PolicyEventProcessingLogin##"

The following error appears continuously in the SQL Server Agent log after restoring the msdb database:

The activated proc '[dbo].[sp_syspolicy_events_reader]' running on queue 'msdb.dbo.syspolicy_event_queue' output the following: 'Cannot execute as the database principal because the principal "##MS_PolicyEventProcessingLogin##" does not exist, this type of principal cannot be impersonated, or you do not have permission.'

This is normally down to an orphaned user for the ‘##MS_PolicyEventProcessingLogin##’ login.

To fix it, first confirm that an orphaned user is in fact the cause:

USE master
SELECT * FROM sys.database_principals 
WHERE [name] = '##MS_PolicyEventProcessingLogin##'
go
USE msdb
SELECT * FROM sys.database_principals 
WHERE [name] = '##MS_PolicyEventProcessingLogin##'
go

This should show that the SID for the principal in the master database is different from the SID in the msdb database.


Then run the following to correct the orphaned login:

use [master]
go
exec sp_change_users_login 'Auto_Fix', '##MS_PolicyEventProcessingLogin##';
go
use [msdb]
go
exec sp_change_users_login 'Auto_Fix', '##MS_PolicyEventProcessingLogin##';
go
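Note that sp_change_users_login is deprecated in later SQL Server versions; the same remap can be done with ALTER USER instead – a sketch for the msdb user, using the same login name as above:

```sql
-- Non-deprecated equivalent: remap the orphaned database user to its server login
USE [msdb]
GO
ALTER USER [##MS_PolicyEventProcessingLogin##]
    WITH LOGIN = [##MS_PolicyEventProcessingLogin##];
GO
```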

Rerunning the check should show that the SID is now the same in both database_principals tables:

USE master
SELECT * FROM sys.database_principals 
WHERE [name] = '##MS_PolicyEventProcessingLogin##'
go
USE msdb
SELECT * FROM sys.database_principals 
WHERE [name] = '##MS_PolicyEventProcessingLogin##'
go


To clean up, cycle the agent error log:

use msdb
go
sp_cycle_agent_errorlog

Working with Filegroups

Adding a file and filegroup to a database

USE [master]
GO
ALTER DATABASE [DataArchive] 
ADD FILEGROUP [Archive_2015]
GO
ALTER DATABASE [DataArchive] 
ADD FILE ( NAME = N'DataArchive_2015', 
FILENAME = N'G:\DataArchive_2015.ndf' , SIZE = 104448KB , FILEGROWTH = 1048576KB ) 
TO FILEGROUP [Archive_2015]
GO

Making a filegroup default for new data

--Make a filegroup default
ALTER DATABASE core_AdvertDataArchive
MODIFY FILEGROUP Archive_2011 DEFAULT

Checking which tables are located in which filegroups

--Return which filegroup a table is located in
SELECT object_name(i.[object_id]) as Name_of_Object,
i.name as Index_Name,
i.type_desc as Index_Type,
f.name as Name_of_Filegroup,
a.type as Object_Type,
f.type,
f.type_desc
FROM sys.filegroups as f 
INNER JOIN sys.indexes as i 
 ON f.data_space_id = i.data_space_id
INNER JOIN sys.all_objects as a 
 ON i.object_id = a.object_id
WHERE a.type ='U' -- User defined tables only
--AND object_name(i.[object_id]) ='employee' -- Specific object
GO

winhttp.winhttprequest.5.1 the server name or address could not be resolved

Problem: When using POST or GET with sp_OAMethod in SQL Server, you receive the following error:

Error Source Description
0x80072EE7 WinHttp.WinHttpRequest The server name or address could not be resolved 

An example of an HTTP GET function is provided below:

Create function GetHttp
(
    @url varchar(8000)      
)
returns varchar(8000)
as
BEGIN
    DECLARE @win int 
    DECLARE @hr  int 
    DECLARE @text varchar(8000)

    EXEC @hr=sp_OACreate 'WinHttp.WinHttpRequest.5.1',@win OUT 
    IF @hr <> 0 EXEC sp_OAGetErrorInfo @win

    EXEC @hr=sp_OAMethod @win, 'Open',NULL,'GET',@url,'false'
    IF @hr <> 0 EXEC sp_OAGetErrorInfo @win

    EXEC @hr=sp_OAMethod @win,'SEND'
    IF @hr <> 0 EXEC sp_OAGetErrorInfo @win

    EXEC @hr=sp_OAGetProperty @win,'ResponseText',@text OUTPUT
    IF @hr <> 0 EXEC sp_OAGetErrorInfo @win

    EXEC @hr=sp_OADestroy @win 
    IF @hr <> 0 EXEC sp_OAGetErrorInfo @win 
    
    RETURN @text
END

The function is then called with a URL Variable:

DECLARE @url VARCHAR(500)
SET @url = 'http://google.co.uk'
exec GetHttp @url

Although there are many reasons for this error, if the simpler checks have not worked (e.g. making sure the URL is correct, that DNS resolves, and that the firewall is correctly configured) then the issue may be a hidden proxy.

WinHTTP uses its own proxy and depending on how your servers are configured, it may need to be set or unset.

For me, the proxy was enabled and I needed direct access.

Open an elevated PowerShell window:

> netsh winhttp show proxy

Current WinHTTP proxy settings:

    Proxy Server(s) :  proxy.xxxxxxxxx.co.uk:8080

    Bypass List     :  (none)

> netsh winhttp reset proxy

Current WinHTTP proxy settings:

    Direct access (no proxy server).

> netsh winhttp show proxy

Current WinHTTP proxy settings:

    Direct access (no proxy server).

Should I be using COM Objects post SQL 2000??!?

We did this for an old system, but my recommendation would be to use SQL Server CLR (SQLCLR) to access REST services.

Great post here on MSDN of how to do this – http://blogs.msdn.com/b/sqllive/archive/2008/06/18/accessing-rest-based-web-services-using-sql-clr.aspx

Delete a job from multiple servers using SSMS Server Groups

For more detail on Server groups see http://blog.hoegaerden.be/2012/11/24/ssms-connect-to-several-servers-in-one-click-okay-two/ as that explains it in more detail.

The issue is that Job IDs differ from server to server, so a standard sp_delete_job call with a hard-coded ID wouldn’t work – we first need to look up the Job ID by name on each server.

The below sample is to delete the same named job across multiple servers:

USE [msdb]
GO

DECLARE @jobid uniqueidentifier

SELECT @jobid = job_id FROM sysjobs WHERE name = 'Job Name'

-- Only attempt the delete on servers where the job actually exists
IF @jobid IS NOT NULL
    EXEC msdb.dbo.sp_delete_job @job_id = @jobid, @delete_unused_schedule = 1
GO