Made a Blunder

coffeeking Member Posts: 305 ■■■■□□□□□□
Over the weekend I almost unleashed a DoS attack against one of my own monitoring boxes. The database server I started monitoring hosts more than 20 DBs, and I couldn't exclude all the unnecessary traffic even after doing everything I could. The box got to the point where the local database was at 100% of its capacity. I had to take some action and purge some data from the DB, which I did, but I deleted some critical data and spent the last day extremely stressed about what had happened. I took responsibility for my actions and admitted it was a mistake, but I had acted to keep the box online. Luckily the managers are cooperative and understanding, and everything is fine. But for a day I had all the scary thoughts one could possibly have after making such a mistake.

You guys must have done something similar in your careers. Would you like to share?

Comments

  • forkvoid Member Posts: 317
    1) Reimaged someone's machine without taking a backup because they told me they didn't have anything on there. They misunderstood my question. Lesson learned: never trust the client.

    2) Bought the wrong product, costing us ~$650. Product is still on my desk, as it can't be used nor returned due to Dell Premier policy. Lesson learned: never trust the vendor's website.

    3) Took down a backbone router and switch in the middle of the business day. Results are obvious. Lesson learned: no cable-fu in the rack during business hours.

    4) Installed three main UTP ethernet lines from the MDF to the demarc, accidentally running them through a power generation room, resulting in signal so bad that it was nearly unusable. Lesson learned: plan your cable route beforehand, and if you're not a cabler, consult one first.

    5) Became aware that a couple of my friends had found a wide-open backdoor to the custom-coded grading system while they were students and I an intern for the school district. Consulted a teacher on the ethics of how to proceed getting it fixed but no one getting in trouble. He told the district anyways, resulting in me being (temporarily) fired and the students being barred from computer usage at school for two years. Lesson learned: while you may have ethics concerns, those you consult will then have their own, and your privacy may not be one of them.

    One from each of my main jobs.
    The beginning of knowledge is understanding how little you actually know.
  • coffeeking Member Posts: 305 ■■■■□□□□□□
    forkvoid wrote: »
    1) Reimaged someone's machine without taking a backup because they told me they didn't have anything on there. They misunderstood my question. Lesson learned: never trust the client.

    2) Bought the wrong product, costing us ~$650. Product is still on my desk, as it can't be used nor returned due to Dell Premier policy. Lesson learned: never trust the vendor's website.

    3) Took down a backbone router and switch in the middle of the business day. Results are obvious. Lesson learned: no cable-fu in the rack during business hours.

    4) Installed three main UTP ethernet lines from the MDF to the demarc, accidentally running them through a power generation room, resulting in signal so bad that it was nearly unusable. Lesson learned: plan your cable route beforehand, and if you're not a cabler, consult one first.

    5) Became aware that a couple of my friends had found a wide-open backdoor to the custom-coded grading system while they were students and I an intern for the school district. Consulted a teacher on the ethics of how to proceed getting it fixed but no one getting in trouble. He told the district anyways, resulting in me being (temporarily) fired and the students being barred from computer usage at school for two years. Lesson learned: while you may have ethics concerns, those you consult will then have their own, and your privacy may not be one of them.

    One from each of my main jobs.

    Phew... that makes me feel better. I guess I am not the only one then; it would be interesting to see what others have to share.
  • Michael.J.Palmer Member Posts: 407 ■■■□□□□□□□
    Sounds like you were pretty lucky that your bosses are understanding; at most places, losing critical information like that would be a resume-generating event.

    We all learn from mistakes like these, and I'm sure you learned something. :P
    -Michael Palmer
    WGU Networks BS in IT - Design & Management (2nd Term)
    Transfer: BAC1,BBC1,CLC1,LAE1,INC1,LAT1,AXV1,TTV1,LUT1,INT1,SSC1,SST1,TNV1,QLT1,ABV1,AHV1,AIV1,BHV1,BIV1
    Required Courses: EWB2, WFV1, BOV1, ORC1, LET1, GAC1, HHT1, TSV1, IWC1, IWT1, MGC1, TPV1, TWA1, CPW3.
    Key: Completed, WIP, Still to come
  • forkvoid Member Posts: 317
    Avoiding things like these is a great argument for having change control procedures. Everything is planned before it's ever done, and it's looked over and approved by at least one person, usually more. Included in change control is your back-out plan if things go wrong. Every single mistake mentioned here would have been prevented by following proper policy.

    Of course, you don't realize this until after you've made a good number of mistakes.
    The beginning of knowledge is understanding how little you actually know.
  • rsutton Member Posts: 1,029 ■■■■■□□□□□
    We all have our stories. A few I can think of:

    RDP'ing into the company file share server and then doing a shutdown, thinking it was my local machine.

    Entering the wrong syntax into a firewall and bringing down the interwebs during business hours.

    Racking a server and unplugging another server in the process. Of course during business hours.

    Playing a practical joke on someone by changing their desktop wallpaper to an inappropriate picture and then having the CIO walk by and not finding it very funny. Followed by a stern talking to from my manager.

    Closing an RDP session that someone accidentally left open; it turned out they were writing a script that they hadn't yet saved.

    I could go on...
  • steve13ad Member Posts: 398 ■■■■□□□□□□
    Who hasn't accidentally unplugged a switch/router/server while removing another piece of equipment?

    Plus when you mess up again, you can always say, "Hey, at least I didn't delete any DBs this time!"
  • stuh84 Member Posts: 503
    Back before I had really got into the Cisco side properly, I was on a router trying to discover why one site wouldn't work. I put it down to the access list configured for it, so I did the following:

    no access-list 102 ip x.x.x.x x.x.x.x any

    Now my logic at the time was that, given this works for most other commands (no shut, no ip address, no vlan, etc.), it would work for this one line, right?

    This is where I found out the hard way how access-lists work on Ciscos. Taking down 1500 individual ATMs at the time for about 2 hours without realising it is probably not the best way to learn lessons, but I'll never do it again!
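    For anyone who hasn't been bitten by this yet, here's a minimal sketch of the safer way to edit a numbered ACL (assuming an IOS release with ACL sequence-number support, roughly 12.3 and later; the sequence numbers and addresses below are placeholders, not from the incident above):

    ! On classic IOS, "no access-list 102 <anything>" deletes the ENTIRE
    ! numbered list, not just the one entry you name. To remove a single
    ! line, use sequence numbers in ACL configuration mode instead:
    ip access-list extended 102
     no 30
     25 permit ip 10.1.1.0 0.0.0.255 any
    ! "no 30" removes only the entry with sequence number 30, and the next
    ! line inserts a new entry at sequence 25 without touching the rest.
    ! Safer still: keep a "reload in 15" scheduled (config unsaved) so the
    ! router falls back to the saved config if a change locks you out.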
    Work In Progress: CCIE R&S Written

    CCIE Progress - Hours reading - 15, hours labbing - 1
  • coffeeking Member Posts: 305 ■■■■□□□□□□
    forkvoid wrote: »
    Avoiding things like these is a great argument for having change control procedures. Everything is planned before it's ever done, and it's looked over and approved by at least one person, usually more. Included in change control is your back-out plan if things go wrong. Every single mistake mentioned here would have been prevented by following proper policy.

    Of course, you don't realize this until after you've made a good number of mistakes.

    Yes, this was one of the lessons learned.
  • mikedisd2 Member Posts: 1,096 ■■■■■□□□□□
    While ordering a printer fuser, I lined up the part description to the wrong code; ordered a 120v fuser instead of a 240v. Installed the fuser. After I left the client site, it melted all the wiring inside. $8000 printer stuffed.

    Quoted RAM prices over the phone to a customer who immediately placed an order. When he came to collect, I found out the price was about three times as much as I had quoted. Received a written complaint.

    Ran updates on Exchange 2007 server during business hours. Services stopped functioning. Had to reboot. Director General straight on the phone to my manager.

    Sometimes wondered why I got out of bed. :)
  • coffeeking Member Posts: 305 ■■■■□□□□□□
    mikedisd2 wrote: »
    While ordering a printer fuser, I lined up the part description to the wrong code; ordered a 120v fuser instead of a 240v. Installed the fuser. After I left the client site, it melted all the wiring inside. $8000 printer stuffed.

    This is a lot scarier than what I had done.
  • L0gicB0mb508 Member Posts: 538
    I reinstalled the OS on a client machine after being told there was nothing important on it. Oh yeah, there turned out to be some critical files for a law firm on it.

    Shut down an entire rack of servers while changing a UPS battery. They overloaded the other UPS and it dropped everything. This was on a military network, while there was a meeting going on. OUCH

    Restarted a server at a law firm during their lunch hour. Well, it turns out a lawyer was doing something and lost about an hour's worth of work.

    Dropped a removable hard drive on a concrete floor. The hard drive contained a week's worth of backups. Yeah, those were gone.

    Etc.etc.
    I bring nothing useful to the table...
  • coffeeking Member Posts: 305 ■■■■□□□□□□
    Looks like this is going to be a long thread, since everyone seems to have a story to share.
  • mikedisd2 Member Posts: 1,096 ■■■■■□□□□□
    coffeeking wrote: »
    This is a lot scarier than what I had done.

    Yeah, that one kept on coming around and took a while to dissipate. I think it'll still be mentioned in my eulogy.
  • jojopramos Member Posts: 415
    Installed the updated version of Trend Micro during business hours on the server that hosts the ERP database. I corrupted the database... restored it after 3 hours...

    Accidentally restarted the Exchange server during business hours through RDP... I was supposed to restart my WSUS server, not this one... silly me...
  • sidsanders Member Posts: 217 ■■■□□□□□□□
    As much as change control helps prevent problems, if done poorly it just makes things sloooooooooooow that shouldn't be slow. It took about 2 weeks to get through the "new" change control approval process just to rename a service account in AD. That account wasn't even being used...

    The best problem I can recall was a comrade who started deleting DBs he figured were no longer needed... and of course they were needed (Mercury TestDirector). No backups...
    GO TEAM VENTURE!!!!
  • rsutton Member Posts: 1,029 ■■■■■□□□□□
    sidsanders wrote: »
    As much as change control helps prevent problems, if done poorly it just makes things sloooooooooooow that shouldn't be slow.

    This is what I have seen more of. I worked for a very, very large company that had change control for everything. The change control was so complex that it took 3 months, 2 change control "specialists", multiple managerial meetings, and many, many hours of filling out forms to deploy... a VB script. Or, if you needed special access to a server, like RDP access, you could expect to wait 1-2 months at least because of change control. We need SLAs for our change controls! (jk)
  • gateway Member Posts: 232
    sidsanders wrote: »
    The best problem I can recall was a comrade who started deleting DBs he figured were no longer needed... and of course they were needed (Mercury TestDirector). No backups...

    Ouch!! :o
    Blogging my AWS studies here! http://www.itstudynotes.uk/aws-csa
  • zobo88 Member Posts: 60 ■■□□□□□□□□
    Another common blunder is to issue a command on system X thinking that you are on system Y. This happens very often, especially if you have multiple terminals open, and it can turn out to be very dangerous.
  • Paul Boz Member Posts: 2,620 ■■■■■■■■□□
    Why do you guys do so much maintenance during business hours?
    CCNP | CCIP | CCDP | CCNA, CCDA
    CCNA Security | GSEC |GCFW | GCIH | GCIA
    pbosworth@gmail.com
    http://twitter.com/paul_bosworth
    Blog: http://www.infosiege.net/
  • networker050184 Mod Posts: 11,962
    Paul Boz wrote: »
    Why do you guys do so much maintenance during business hours?


    That is what I was thinking as well. IMO the only time production stuff should be touched during the day is if there is a major issue that cannot wait for after hours. You are just asking for this kind of stuff if you do.
    An expert is a man who has made all the mistakes which can be made.
  • RobertKaucher Member Posts: 4,299 ■■■■■■■■■■
    I reinstalled the OS on a client machine after being told there was nothing important on it. Oh yeah, there turned out to be some critical files for a law firm on it.


    I have a disclaimer I make clients sign before I do anything like this. They either agree for me to perform a backup, or they sign this stating they acknowledge and understand I will be performing a destructive operation, that a backup was recommended to them, and that they declined the backup, freeing me from any responsibility for lost data.

    I believe this to be a critical process. When I do this at a company I work for, I usually do it via email. I simply state: "Dear so-and-so (or so-and-so's boss), you have requested that I reimage your PC and have stated there is no business-critical data on it. As you have requested, I am reimaging your PC without performing a backup to reduce the loss of productivity due to downtime."
  • stuh84 Member Posts: 503
    Paul Boz wrote: »
    Why do you guys do so much maintenance during business hours?

    For us, it's a lot of break/fix, and with the amount of work we get in, if we had to do it out of hours, we'd never work IN hours.

    It's all risk assessment; 99.999% of the time we never put a foot wrong, but it's always the 0.001% that people remember. :(
    Work In Progress: CCIE R&S Written

    CCIE Progress - Hours reading - 15, hours labbing - 1
  • SrSysAdmin Member Posts: 259
    steve13ad wrote: »
    Who hasn't accidentally unplugged a switch/router/server while removing another piece of equipment?

    Plus when you mess up again, you can always say, "Hey, at least I didn't delete any DBs this time!"



    So true...I managed to unplug the main switch for the entire company a few months back! Fortunately nothing horribly bad went wrong, just a few WTFs from people and a temporary loss of network connectivity for a few minutes.


    Lesson learned...when racking/unracking heavy equipment, use the phone a friend card.
    Current Certifications:

    * B.S. in Business Management
    * Sec+ 2008
    * MCSA

    Currently Studying for:
    * 70-293 Maintaining a Server 2003 Network

    Future Plans:

    * 70-294 Planning a Server 2003 AD
    * 70-297 Designing a Server 2003 AD
    * 70-647 Server 2008
    * 70-649 MCSE to MCITP:EA
  • Hyper-Me Banned Posts: 2,059
    Paul Boz wrote: »
    Why do you guys do so much maintenance during business hours?

    Because the gap between business-hours consulting fees and after-hours consulting fees is egregious. B)
  • Paul Boz Member Posts: 2,620 ■■■■■■■■□□
    Hyper-Me wrote: »
    Because the gap between business-hours consulting fees and after-hours consulting fees is egregious. B)

    Not if you're a competent IT manager. The risk of huge productivity loss due to downtime from business-hours maintenance isn't worth it. Think how many man-hours were lost just from the accidents in this thread, then think of how many of those man-hours could have been saved using proper service windows.
    CCNP | CCIP | CCDP | CCNA, CCDA
    CCNA Security | GSEC |GCFW | GCIH | GCIA
    pbosworth@gmail.com
    http://twitter.com/paul_bosworth
    Blog: http://www.infosiege.net/
  • forkvoid Member Posts: 317
    Paul Boz wrote: »
    Not if you're a competent IT manager. The risk of huge productivity loss due to downtime from business-hours maintenance isn't worth it. Think how many man-hours were lost just from the accidents in this thread, then think of how many of those man-hours could have been saved using proper service windows.

    That's not usually an easy sell to clients, because they never expect the downtime. For those working in a one-client operation, you make a valid point. When you're a VAR, it's not so simple.
    The beginning of knowledge is understanding how little you actually know.
  • phantasm Member Posts: 995
    Paul Boz wrote: »
    Why do you guys do so much maintenance during business hours?

    I work for a national ISP, and when things break they need to be fixed right then, not in a maintenance window. Although we do about 99% of our maintenance and upgrades in a maintenance window, there are times when a card goes down or some other hardware fails and it needs to be resolved immediately.
    "No man ever steps in the same river twice, for it's not the same river and he's not the same man." -Heraclitus
  • shodown Member Posts: 2,271
    Changed out a WAAS module and shut down an entire site's call center in the process. The company had 2 outgoing routers (so they thought it wouldn't cause a problem), and they didn't want to pay the extra fees for the field technician to come back after hours.

    Changed PBR on outgoing routes, and for some dumb reason I decided to do the same thing on incoming traffic (when there was no need to). Black-holed everything coming in. Nobody cared about the great outgoing performance when nothing could get back in.
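    For anyone not familiar with PBR, here's a minimal sketch of what a typical IOS policy looks like (the ACL number, route-map name, addresses, and interface below are placeholders, not the actual config from this story):

    ! Classic PBR: steer traffic arriving from the LAN toward a preferred next hop.
    access-list 110 permit ip 10.1.1.0 0.0.0.255 any
    route-map LAN-OUT permit 10
     match ip address 110
     set ip next-hop 203.0.113.1
    !
    interface GigabitEthernet0/0
     ip policy route-map LAN-OUT
    ! PBR acts on traffic arriving on the interface it is applied to. Apply a
    ! similar policy to return traffic with a next hop that has no path back,
    ! and inbound packets are effectively black-holed.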
    Currently Reading

    CUCM SRND 9x/10, UCCX SRND 10x, QOS SRND, SIP Trunking Guide, anything contact center related
  • Paul Boz Member Posts: 2,620 ■■■■■■■■□□
    phantasm wrote: »
    I work for a national ISP, and when things break they need to be fixed right then, not in a maintenance window. Although we do about 99% of our maintenance and upgrades in a maintenance window, there are times when a card goes down or some other hardware fails and it needs to be resolved immediately.

    Emergency maintenance isn't stacking new network equipment, servers, or batteries. I am very familiar with what you're talking about (I used to make the very same repairs to ATM/fiber mux equipment at my ISP job). If a line card eats it, you have to fix it. However, if you're doing an upgrade, do it at the right time.
    CCNP | CCIP | CCDP | CCNA, CCDA
    CCNA Security | GSEC |GCFW | GCIH | GCIA
    pbosworth@gmail.com
    http://twitter.com/paul_bosworth
    Blog: http://www.infosiege.net/
  • Devilsbane Member Posts: 4,214 ■■■■■■■■□□
    I have yet to make any critical mistakes, but I have also rarely been put in a position to make these mistakes. Oh, the joys to look forward to.

    I hate to say it, but several of these mistakes have made me laugh. Almost out loud, which is not the best thing while sitting at my desk.
    Decide what to be and go be it.