Last active: October 10, 2022
<#
Updating SQL Data Types with dbatools
Author: Josh Smith
Date: 2022-09-22

Purpose: converting integer columns to bigint was mainly a challenge of time: simply
altering the column data type led to _extremely_ long runtimes. That could be mitigated
by creating a copy of the table from scratch with the correct data type and then
transferring the data in. This script leverages the dbatools PowerShell module to do just that.

1. Tables in the application schema are retrieved by the initial SQL query below (taken from
   the process used to fix the test environment). Each table definition is then retrieved and
   saved to a local temp file so a copy of the table can be created once the table name and any
   necessary dependent objects are given temporary names (in this case by appending "_BIGINT").
   The new table is then created.
2. If any records exist in the original table, dbatools is again used to bulk copy the data
   into the new table; otherwise the script proceeds immediately to step 3.
3. The original table is dropped and the new table is renamed to replace it (all objects that
   were given a name like xxxxx_BIGINT are renamed back to just xxxxx).

This process is desirable for a couple of reasons:
* It is flexible and can account for any tables created in production after the database
  backup was taken and restored into our testing environment.
* The total run time was "just" 18.5 hours for all tables, including the largest, which
  has over 500 million rows and had already run for 12 hours on its own when we attempted
  to alter the table directly.
* Lastly, PowerShell is much better suited to the string manipulation needed for scripting
  out and renaming objects, particularly since dbatools has built-in tools for this.
#>
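<# For context, the direct approach that proved too slow was a single in-place ALTER,
   e.g. (illustrative column name):

       ALTER TABLE [hsi].[dtxdpform] ALTER COLUMN [itemnum] BIGINT NOT NULL;

   An int-to-bigint change is a size-of-data operation: every row is rewritten inside
   a single transaction, so it cannot be batched -- hence the copy-and-swap approach below.
#>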
$sqlinstance = '<target sql instance>'
$database = '<target database name>'
$sql = "WITH [results]
AS ( SELECT DISTINCT [t].[name] AS [TableName]
          , [s].[name] AS [SchemaName]
          , [p].[rows]
     FROM [sys].[tables] AS [t]
     INNER JOIN [sys].[indexes] AS [i] ON [t].[object_id] = [i].[object_id]
     INNER JOIN [sys].[partitions] AS [p] ON [i].[object_id] = [p].[object_id]
     INNER JOIN [sys].[allocation_units] AS [a] ON [p].[partition_id] = [a].[container_id]
     LEFT OUTER JOIN [sys].[schemas] AS [s] ON [t].[schema_id] = [s].[schema_id]
     WHERE [t].[is_ms_shipped] = 0
     AND [i].[object_id] > 255
     /* add to the WHERE clause as needed to limit the results */
   )
SELECT [r].[TableName], [r].[SchemaName], [r].[rows]
FROM [results] AS [r]
WHERE [r].[TableName] IN ('dtxdpform')
ORDER BY [r].[rows] ASC, [r].[TableName];"
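<# the query returns one row per table; the columns are used throughout the loop below.
   Illustrative output (row count is made up):

       TableName   SchemaName   rows
       ---------   ----------   ----------
       dtxdpform   hsi          500000000
#>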
<# these set SMO options to ensure all the necessary related objects
   (indexes, keys, constraints, etc.) are included in the table script
   and preserved when switching to the new table: #>
$options = New-DbaScriptingOption
$options.ClusteredIndexes = $true
$options.Default = $true
$options.FullTextIndexes = $true
$options.Indexes = $true
$options.NonClusteredIndexes = $true
$options.SchemaQualify = $true
$options.ScriptSchema = $true
$options.Triggers = $true
$options.DriAll = $true
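<# with those options set, Export-DbaScript emits a full definition for each table,
   roughly like this (illustrative):

       CREATE TABLE [hsi].[dtxdpform](
           [itemnum] [int] NOT NULL,
           CONSTRAINT [PK_dtxdpform] PRIMARY KEY CLUSTERED ([itemnum] ASC)
       ) ON [PRIMARY]

   plus any index, trigger, and constraint definitions -- this is the text the
   -replace calls below rewrite to [bigint] and *_BIGINT names.
#>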
$tables = Invoke-DbaQuery -SQLInstance $sqlinstance -Database $database -Query $sql
<# some stats tracking #>
$startTime = Get-Date
$tableErrors = @()
$iCount = 0
$total = $tables.Count

<# temp file path used when scripting out each table (placeholder -- set as needed) #>
$localPath = 'C:\temp\table_script.sql'
<# for each table, starting with the fewest rows, until they are all complete: #>
foreach ($bt in $tables) {
    <# script out the original table #>
    $t = Get-DbaDbTable -SqlInstance $sqlinstance -Database $database -Schema $bt.SchemaName -Table $bt.TableName
    $script = $t | Export-DbaScript -FilePath $localPath -ScriptingOptionsObject $options -PassThru
    $tblName = $bt.TableName
    $schemaName = $bt.SchemaName
    $tmpTable = $false
    $copyData = $false
    $tableDrop = $false
    if ($script -match '\[int\]') {
        $tblReplace = "\.\[$tblName\]"
        $tgtName = ".[$tblName" + "_BIGINT]"
        $script = $script -replace '\[int\]', '[bigint]'
        $script = $script -replace $tblReplace, $tgtName
        <# ensure unique names for constraints by appending _BIGINT to each one not yet renamed: #>
        while ('' -ne ($script -match '(?!.+\[.+\BIGINT\].+)CONSTRAINT \[.+\]')) {
            $replace = ($script -match '(?!.+\[.+\BIGINT\].+)CONSTRAINT \[.+\]')[0].ToString().Trim()
            $replace = $replace -replace '\[', '\['           <# escape [ for use as a regex pattern #>
            $replacement = $replace -replace '\]', '_BIGINT]' <# append _BIGINT to the constraint name #>
            $replace = $replace -replace '\]', '\]'           <# escape ] in the pattern #>
            $replacement = $replacement -replace '\\\[', '['  <# un-escape [ in the replacement text #>
            $script = $script -replace $replace, $replacement
        }
        <# run the modified script to create the new table #>
        try {
            Invoke-DbaQuery -SqlInstance $sqlInstance -Database $database -Query $script -EnableException
            $tmpTable = $true
        }
        catch {
            $tableErrors += "$tblName BIGINT table not created."
        }
        <# only continue if the new table was created _and_ there are rows to process #>
        if ($t.RowCount -gt 0 -and $tmpTable -eq $true) {
            <# set parameters for bulk copying data from the original table to the replacement: #>
            $params = @{
                SqlInstance         = $sqlInstance
                Destination         = $sqlInstance
                Database            = $database
                DestinationDatabase = $database
                Table               = "[hsi].[" + $tblName + "]"
                DestinationTable    = "[hsi].[" + $tblName + "_BIGINT]"
                KeepIdentity        = $true
                KeepNulls           = $true
                BatchSize           = 10000
                Truncate            = $true
            }
            try {
                Copy-DbaDbTableData @params -EnableException
                $copyData = $true
            }
            catch {
                $tableErrors += "$tblName data not copied!"
            }
        }
        elseif ($tmpTable -eq $true) {
            <# set copyData to true if the new table was created but there are no rows #>
            $copyData = $true
        }
        <# if we successfully copied the data to the new table, drop the old table #>
        if ($copyData -eq $true) {
            try {
                $qry = "DROP TABLE [" + $schemaName + "].[" + $tblName + "];"
                Invoke-DbaQuery -SqlInstance $sqlinstance -Database $database -Query $qry -EnableException
                $tableDrop = $true
            }
            catch {
                $tableErrors += "$tblName not dropped!"
            }
        }
        <# if we successfully dropped the old table, rename the new one to replace it #>
        if ($tableDrop -eq $true) {
            try {
                <# rename all the objects we appended _BIGINT to
                   (e.g. [hsi].[dtxdpform_BIGINT] becomes [dtxdpform] again) #>
                $qry = "DECLARE @originalName VARCHAR(100), @updName VARCHAR(100);

DECLARE cursor_name CURSOR FAST_FORWARD READ_ONLY FOR
    SELECT SCHEMA_NAME([schema_id]) + '.' + [name] AS [objName]
         , LEFT([name], LEN([name]) - 7) AS [updName]
    FROM [sys].[objects]
    WHERE [name] LIKE '%[_]BIGINT'
    AND [type_desc] NOT IN ( 'SYSTEM_TABLE' );

OPEN cursor_name;
FETCH NEXT FROM cursor_name INTO @originalName, @updName;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_rename @originalName, @updName;
    FETCH NEXT FROM cursor_name INTO @originalName, @updName;
END
CLOSE cursor_name;
DEALLOCATE cursor_name;"
                Invoke-DbaQuery -SqlInstance $sqlinstance -Database $database -Query $qry -EnableException
            }
            catch {
                $tableErrors += "$tblName not renamed!"
            }
        }
    }
    else {
        Write-Output "No integers in $tblName (skipping)"
    }
    $iCount++
}
$tSpan = New-TimeSpan -Start $startTime -End (Get-Date)
Write-Output "Processing took $($tSpan.TotalMinutes) minutes!"
Write-Output "Processed $iCount tables out of $total."
Write-Output "$($tableErrors.Count) errors encountered!"
if ($tableErrors.Count -gt 0) {
    <# write out the tables & steps we had issues with: #>
    $tableErrors >> Errors.txt
}