SQL Transaction Logs - Export
chanakyajupudi
Member Posts: 712
in Off-Topic
Hey!
I am trying to understand how one exports transaction logs from MSSQL 2012/2014. I can use Backup-SqlDatabase with the -BackupAction Log flag, but I seem to lose it because every time I run the command it appends to the same file.
Am I doing something wrong?
I am not a DBA and may not understand the implications completely. I am trying to back up the database in full at the end of the day and the transaction logs every hour, for now; the intervals will change in production. I work in an almost zero-loss environment.
Suggestions or advice!
Cheers
Work In Progress - RHCA [ ] Certified Cloud Security Professional [ ] GMON/GWAPT if Work Study is accepted [ ]
http://adarsh.amazonwebservices.ninja
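For reference: the "appending" behavior described above is the default for both T-SQL BACKUP and Backup-SqlDatabase, which add each new backup set to the existing media set in the target file rather than overwriting it. A minimal sketch of one log backup per uniquely named file, assuming the SQLPS module; the instance and database names are placeholders:

Import-Module SQLPS -DisableNameChecking

$instance = "localhost"       # placeholder: your SQL instance
$database = "MyDatabase"      # placeholder: the database to back up
$stamp    = Get-Date -Format yyyy-MM-dd-HHmmss

# Writing each log backup to its own timestamped .trn file avoids the
# default behavior of appending backup sets to one reused file.
Backup-SqlDatabase -ServerInstance $instance -Database $database `
    -BackupAction Log `
    -BackupFile "C:\Backups\${database}_log_$stamp.trn"

(Backup-SqlDatabase also has an -Initialize switch that overwrites the backup sets already in the file, but unique file names keep a restorable chain on disk.)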
Comments
-
philz1982 Member Posts: 978
From what SQL vendor?
Read my blog @ www.buildingautomationmonthly.com
Connect with me on LinkedIn @ https://www.linkedin.com/in/phillipzito -
chanakyajupudi Member Posts: 712
SQL 2012/2014 from Microsoft. I am pretty sure the Backup-SqlDatabase cmdlet works only for MS SQL. Either way, I should have made it clear.
Work In Progress - RHCA [ ] Certified Cloud Security Professional [ ] GMON/GWAPT if Work Study is accepted [ ]
http://adarsh.amazonwebservices.ninja -
philz1982 Member Posts: 978
OK, first you need to back up the log; then you can convert it to CSV format or use a log viewer. What is the purpose of the log export? Are you trying to do compliance?
https://msdn.microsoft.com/en-us/library/ms179478.aspx
https://technet.microsoft.com/en-us/library/ms190440(v=sql.105).aspx
Read my blog @ www.buildingautomationmonthly.com
Connect with me on LinkedIn @ https://www.linkedin.com/in/phillipzito -
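In T-SQL terms, a log backup along the lines of those docs looks roughly like the following; this is a sketch with placeholder instance, database, and path, run here through Invoke-Sqlcmd:

Import-Module SQLPS -DisableNameChecking

# WITH INIT overwrites any existing backup sets in the target file
# instead of appending a new one, which is the default (NOINIT).
Invoke-Sqlcmd -ServerInstance "localhost" -Query @"
BACKUP LOG [MyDatabase]
TO DISK = N'C:\Backups\MyDatabase_log.trn'
WITH INIT, NAME = N'MyDatabase log backup';
"@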
chanakyajupudi Member Posts: 712
We are using AWS for the infrastructure. The goal is to ensure that I do not lose any data at all if I lose the instance that hosts my SQL instance. This being an advanced proof of concept, I can't introduce a cluster or an HA solution within SQL itself; I am trying to show that recovery can be achieved completely. I am doing a full backup every hour and want to pick up transaction logs every 30 minutes, so I can restore the full backup from the previous hour plus the transaction logs of the last 30 minutes and avoid data loss.
My data writes happen every 30 minutes, hence the 30-minute transaction log intervals.
Work In Progress - RHCA [ ] Certified Cloud Security Professional [ ] GMON/GWAPT if Work Study is accepted [ ]
http://adarsh.amazonwebservices.ninja -
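To make that recovery chain concrete, the restore side of the scheme would look roughly like this, assuming the SQLPS module; the file names, instance, and database are placeholders:

Import-Module SQLPS -DisableNameChecking

$instance = "localhost"     # placeholder instance
$database = "MyDatabase"    # placeholder database

# Restore the last full backup, leaving the database in the RESTORING
# state (-NoRecovery) so that log backups can still be applied.
# Add -ReplaceDatabase if the database already exists on the instance.
Restore-SqlDatabase -ServerInstance $instance -Database $database `
    -BackupFile "C:\Backups\MyDatabase_2015-03-25-120000.bak" `
    -RestoreAction Database -NoRecovery

# Apply the log backups taken since that full backup, in order; only
# the last restore omits -NoRecovery, which brings the database online.
Restore-SqlDatabase -ServerInstance $instance -Database $database `
    -BackupFile "C:\Backups\MyDatabase_log_2015-03-25-123000.trn" `
    -RestoreAction Log -NoRecovery
Restore-SqlDatabase -ServerInstance $instance -Database $database `
    -BackupFile "C:\Backups\MyDatabase_log_2015-03-25-130000.trn" `
    -RestoreAction Log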
philz1982 Member Posts: 978
OK, so are you using AWS as a PaaS or as pure IaaS for a sandboxed SQL instance? Your purpose here is to test the capability to implement an incremental recovery prior to rolling your DB into production. I guess I'm still trying to figure out your objective.
The log restore is easy to script, with the eventing engine acting as a trigger.
Read my blog @ www.buildingautomationmonthly.com
Connect with me on LinkedIn @ https://www.linkedin.com/in/phillipzito -
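For the trigger side, one sketch is a Windows scheduled task that fires a log-backup script on the 30-minute cadence mentioned above; this assumes the ScheduledTasks module (Server 2012+), and the script path and task name are hypothetical:

# Run a (hypothetical) log-backup script every 30 minutes, indefinitely.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-File C:\Scripts\Backup-TransactionLogs.ps1"
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Minutes 30) `
    -RepetitionDuration ([TimeSpan]::MaxValue)
Register-ScheduledTask -TaskName "SqlLogBackups" -Action $action -Trigger $trigger

A SQL Server Agent job would be the more conventional way to do the same thing server-side.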
philz1982 Member Posts: 978
OK, rereading your question: you need to look at scripting an incremental backup using an n+1 mirrored database structure. What is the velocity/volume of your data flows?
Read my blog @ www.buildingautomationmonthly.com
Connect with me on LinkedIn @ https://www.linkedin.com/in/phillipzito -
chanakyajupudi Member Posts: 712
Using AWS as pure IaaS. There are a lot of other instances that are resilient, but I can't build resilience for the SQL instance right now. I am testing the capability of AWS to host this solution, which is spread over 12 applications working in sync.
The log backup script is what I am looking for, triggered every time an event occurs. I will add functionality to that script to upload the logs to S3 myself.
I have done the below for Full Backups.
param(
    $serverName,
    $backupDirectory,
    $daysToStoreBackups
)

# Load the SQL Server Management Objects (SMO) assemblies
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null

$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $serverName
$dbs = $server.Databases

# Full backup of every user database to a timestamped .bak file
foreach ($database in $dbs | Where-Object { $_.IsSystemObject -eq $False })
{
    $dbName = $database.Name
    $timestamp = Get-Date -Format yyyy-MM-dd-HHmmss
    $backupFileName = $dbName + "_" + $timestamp + ".bak"
    $targetPath = $backupDirectory + "\" + $backupFileName

    $smoBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
    $smoBackup.Action = "Database"
    $smoBackup.BackupSetDescription = "Full Backup of " + $dbName
    $smoBackup.BackupSetName = $dbName + " Backup"
    $smoBackup.Database = $dbName
    $smoBackup.MediaDescription = "Disk"
    $smoBackup.Devices.AddDevice($targetPath, "File")
    $smoBackup.SqlBackup($server)

    # Upload the backup to S3 (the -AccessKey/-SecretKey values were omitted
    # from the post; supply them here, or drop the parameters and use a
    # stored credential profile / instance role instead)
    Write-S3Object -BucketName sqlbackup-ams-vector -File $targetPath -Key $backupFileName -Region ap-southeast-2

    "backed up & uploaded $dbName ($serverName) to $targetPath"
}

# Prune local backups older than the retention window
Get-ChildItem "$backupDirectory\*.bak" | Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-$daysToStoreBackups) } | ForEach-Object { Remove-Item $_ -Force }
"removed all previous backups older than $daysToStoreBackups days"
Cheers
Work In Progress - RHCA [ ] Certified Cloud Security Professional [ ] GMON/GWAPT if Work Study is accepted [ ]
http://adarsh.amazonwebservices.ninja -
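For the log half, here is a sketch that adapts the same SMO pattern from the script above: Action = "Log" and a timestamped .trn file per run, with the same S3 upload. The recovery-model check matters because log backups are not possible for databases in SIMPLE recovery. The bucket and region are copied from the script above; credential handling is assumed to come from a profile or instance role:

param(
    $serverName,
    $backupDirectory
)

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null

$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $serverName

foreach ($database in $server.Databases | Where-Object { -not $_.IsSystemObject })
{
    # Log backups require the FULL (or BULK_LOGGED) recovery model;
    # skip databases in SIMPLE recovery.
    if ($database.RecoveryModel -eq "Simple") { continue }

    $dbName = $database.Name
    $timestamp = Get-Date -Format yyyy-MM-dd-HHmmss
    $backupFileName = $dbName + "_log_" + $timestamp + ".trn"
    $targetPath = Join-Path $backupDirectory $backupFileName

    $smoBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
    $smoBackup.Action = "Log"                  # transaction log, not full
    $smoBackup.BackupSetDescription = "Log backup of " + $dbName
    $smoBackup.Database = $dbName
    $smoBackup.Devices.AddDevice($targetPath, "File")
    $smoBackup.SqlBackup($server)

    # Same upload as the full-backup script; credentials via a stored
    # profile or instance role rather than inline keys.
    Write-S3Object -BucketName sqlbackup-ams-vector -File $targetPath -Key $backupFileName -Region ap-southeast-2

    "backed up & uploaded $dbName log to $targetPath"
}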
philz1982 Member Posts: 978
http://download.microsoft.com/download/D/2/0/D20E1C5F-72EA-4505-9F26-FEF9550EFD44/Microsoft%20SQL%20Server%20AlwaysOn%20Solutions%20Guide%20for%20High%20Availability%20and%20Disaster%20Recovery.docx (Microsoft SQL Server AlwaysOn Solutions Guide for High Availability and Disaster Recovery)
Introduction to Database Mirroring - Part II - ExtremeExperts
Read my blog @ www.buildingautomationmonthly.com
Connect with me on LinkedIn @ https://www.linkedin.com/in/phillipzito -
chanakyajupudi Member Posts: 712
Volume for now is maybe 5 MB. The script has to allow for scaling vertically; production loads can be 15 GB.
Work In Progress - RHCA [ ] Certified Cloud Security Professional [ ] GMON/GWAPT if Work Study is accepted [ ]
http://adarsh.amazonwebservices.ninja -
chanakyajupudi Member Posts: 712
I cannot use mirroring as part of the POC that I am doing. I will, however, use mirroring and a large cluster to achieve HA/DR. For now I need to export transaction logs and put them on S3.
Work In Progress - RHCA [ ] Certified Cloud Security Professional [ ] GMON/GWAPT if Work Study is accepted [ ]
http://adarsh.amazonwebservices.ninja -
philz1982 Member Posts: 978
OK, I can't read configs on my phone, but if the docs I linked don't help, I'll look at it over the weekend, as I am at customer showcases for the next few days.
Read my blog @ www.buildingautomationmonthly.com
Connect with me on LinkedIn @ https://www.linkedin.com/in/phillipzito -
chanakyajupudi Member Posts: 712
Cool! Thanks for the help.
Work In Progress - RHCA [ ] Certified Cloud Security Professional [ ] GMON/GWAPT if Work Study is accepted [ ]
http://adarsh.amazonwebservices.ninja -
philz1982 Member Posts: 978
Having no mirroring is going to make it much harder in a no-loss scenario. I'm not even sure how that would work: once the instance goes down, your transaction buffer is going to fill up fast with no failover option.
Read my blog @ www.buildingautomationmonthly.com
Connect with me on LinkedIn @ https://www.linkedin.com/in/phillipzito -
chanakyajupudi Member Posts: 712
Without disclosing a lot:
A regulatory body holds the data on an FTP server. We poll every 30 minutes to ensure the FTP server is not full, and once polled, data is removed from the FTP server to accommodate more data. The application then uses the polled data to perform certain actions and writes it to the DB.
As long as I have a transaction log of the data that was polled and written, I am good, for the time being.
When I go into production there will be no regulatory-body FTP server. Then we intend to use HA/mirroring/DR designs.
Work In Progress - RHCA [ ] Certified Cloud Security Professional [ ] GMON/GWAPT if Work Study is accepted [ ]
http://adarsh.amazonwebservices.ninja