Database cleanup
N2IT
Any recommended best practices anyone cares to share? Right now I am going through ~100 tables in a DW, checking each table's usage to see if any tables are just floating out there not being used. I found some queries that report seek, scan, and update activity per table. Any other recommendations while going through this exercise? Right now I am capturing each table in a spreadsheet, noting the PK, FK, and other constraints, and whether the table has data, etc. Can you think of anything else that would be helpful? Thanks!
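A minimal sketch of the kind of activity check described above, assuming a SQL Server data warehouse (SSMS comes up later in the thread). sys.dm_db_index_usage_stats tracks seeks, scans, lookups, and updates per index, though its counters reset when the instance restarts:

-- Per-table activity since the last instance restart.
-- Tables returning all NULLs have no recorded activity in that window.
SELECT
    s.name AS schema_name,
    t.name AS table_name,
    SUM(us.user_seeks)     AS seeks,
    SUM(us.user_scans)     AS scans,
    SUM(us.user_lookups)   AS lookups,
    SUM(us.user_updates)   AS updates,
    MAX(us.last_user_seek) AS last_seek,
    MAX(us.last_user_scan) AS last_scan
FROM sys.tables AS t
JOIN sys.schemas AS s
    ON s.schema_id = t.schema_id
LEFT JOIN sys.dm_db_index_usage_stats AS us
    ON us.object_id = t.object_id
   AND us.database_id = DB_ID()
GROUP BY s.name, t.name
ORDER BY ISNULL(SUM(us.user_seeks + us.user_scans + us.user_lookups + us.user_updates), 0);

Tables that sort to the top with no activity are candidates for the "floating" list, with the caveat that the stats only cover the period since the last restart.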
Comments
NotHackingYou
In SSMS, right-click the table and choose 'View Dependencies'. It is not always 100% indicative of how the developers have linked the tables, but it at least shows you some of the relationships.
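For reference, a sketch of pulling similar information in T-SQL; sys.sql_expression_dependencies holds much of the same catalog data the dialog draws on (dbo.YourTable is a placeholder name):

-- Objects (views, procs, functions) that reference a given table.
SELECT
    OBJECT_SCHEMA_NAME(d.referencing_id) AS referencing_schema,
    OBJECT_NAME(d.referencing_id)        AS referencing_object
FROM sys.sql_expression_dependencies AS d
WHERE d.referenced_id = OBJECT_ID(N'dbo.YourTable');

Like View Dependencies, this only sees relationships SQL Server knows about; joins done purely in application code won't show up.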
N2IT
Very nice, this will help with my analysis. Thanks - I am always learning something new in SSMS.
jibbajabba
Drop all tables and only restore the ones people complain about
N2IT
LOL, no thanks! Carl, what do you use to export to? Everyone is saying copy to the clipboard, but I am unable to paste into MS Paint. Tried Word; that was a monster fail. I am looking at some tools that connect through ODBC or JDBC but haven't found anything I am in love with yet. Thoughts?
NightShade03
First thought here is to install something like NewRelic on your application servers. The insights and crazy metrics you'll get out of such a tool are amazing, plus it can pinpoint not only application problems but the exact lines of code that query the DB. You can use this to tune the parameters of the DB, storage engines, application servers, etc. Once you've addressed the performance aspect, then I would turn to the tables.
For the tables you can do a dump of the schema only (the mechanics are different for every DB, but the concept is the same). Once you dump the schemas you'll hopefully be able to see the PK and FK fields for each table and map those back to dependencies if required. Not sure how deep you want to go, but another common task here would be to write a script/expression that validates the data in each field to ensure it conforms to the specific schema for that table/field. This is more of a data validation exercise than DB tuning... but they go hand in hand.
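A sketch of the PK/FK inventory piece using the standard INFORMATION_SCHEMA views, which most RDBMSs expose, in line with the "concept is the same" point above:

-- PK and FK columns per table, suitable for pasting into the spreadsheet.
SELECT
    tc.TABLE_SCHEMA,
    tc.TABLE_NAME,
    tc.CONSTRAINT_TYPE,  -- 'PRIMARY KEY' or 'FOREIGN KEY'
    kcu.COLUMN_NAME
FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS AS tc
JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE AS kcu
    ON kcu.CONSTRAINT_SCHEMA = tc.CONSTRAINT_SCHEMA
   AND kcu.CONSTRAINT_NAME   = tc.CONSTRAINT_NAME
WHERE tc.CONSTRAINT_TYPE IN ('PRIMARY KEY', 'FOREIGN KEY')
ORDER BY tc.TABLE_SCHEMA, tc.TABLE_NAME, tc.CONSTRAINT_TYPE;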
Finally, I'd put some monitoring software in place (you can also use NewRelic for this) to track metrics for the DB server/cluster itself. Are there performance issues? Old hardware that isn't keeping up? You get the idea. By the end of this you should have a well-oiled machine with some baseline metrics to help you measure and maintain the DB tables and server for the long term.
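As a lightweight complement to a monitoring product, a sketch of one common server-side baseline, assuming SQL Server. sys.dm_os_wait_stats accumulates wait times since the last restart, so the usual approach is to snapshot it on a schedule and diff the snapshots (the exclusion list here is trimmed; real baselines filter many more benign wait types):

-- Top wait types by total wait time since the last restart.
SELECT TOP (10)
    wait_type,
    waiting_tasks_count,
    wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP', N'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;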
N2IT
I found something called SchemaCrawler; it seems to work with GraphViz. Night, I'll review your post at home tonight, I have another project meeting to head to. Thanks for replying!
N2IT
Just a quick question: we have a ton of "backup tables" in our prod schema. I see that their only dependency is on themselves; is there anything else I should be aware of? I am going to post my findings to our DBA for removal so it gets another set of eyes. I am trying to make life easier on the DBA. Isn't that what a business analyst is supposed to do?
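A sketch of one way to assemble that list for the DBA, assuming SQL Server; the LIKE patterns are guesses at naming conventions and should be adjusted to whatever the shop actually uses:

-- Candidate backup/copy tables by name pattern, with row counts and creation dates.
SELECT
    s.name      AS schema_name,
    t.name      AS table_name,
    SUM(p.rows) AS row_count,
    t.create_date
FROM sys.tables AS t
JOIN sys.schemas AS s
    ON s.schema_id = t.schema_id
JOIN sys.partitions AS p
    ON p.object_id = t.object_id
   AND p.index_id IN (0, 1)  -- heap or clustered index only, to avoid double counting
WHERE t.name LIKE '%bak%' OR t.name LIKE '%backup%' OR t.name LIKE '%copy%'
GROUP BY s.name, t.name, t.create_date
ORDER BY row_count DESC;

Cross-checking this against the usage-stats query earlier in the thread gives the DBA both the "looks like a backup" signal and the "nobody touches it" signal.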