SQL issue - how can I fix it?

Essendon Member Posts: 4,546 ■■■■■■■■■■
Not an SQL guy by any stretch - hence the question.



The account I used is the owner of the database, and it's also an admin on the SQL machine. The database is SQL Server 2008 R2 Express, and the account is a domain account.

What can I do about this?

EDIT - just noticed that the account I have blurred out in the above photo isn't the one that is the owner of the database. Does it look like this other account is trying to write to a database it isn't the owner of?
NSX, NSX, more NSX..

Blog >> http://virtual10.com

Comments

  • Essendon Member Posts: 4,546 ■■■■■■■■■■


    Does this help? Let me know what's needed for troubleshooting and I'll post it up. Big TIA!
    NSX, NSX, more NSX..

    Blog >> http://virtual10.com
  • blargoe Member Posts: 4,174 ■■■■■■■■■□
    What are you (or the application) trying to do when the errors about trying to create a duplicate unique key occur?
    IT guy since 12/00

    Recent: 11/2019 - RHCSA (RHEL 7); 2/2019 - Updated VCP to 6.5 (just a few days before VMware discontinued the re-cert policy...)
    Working on: RHCE/Ansible
    Future: Probably continued Red Hat Immersion, Possibly VCAP Design, or maybe a completely different path. Depends on job demands...
  • N2IT Inactive Imported Users Posts: 7,483 ■■■■■■■■■■
    The object (table) dbo.flashcluster already has the same data sitting in it, so when the application does an insert it bombs out: the data is already present, and with a unique constraint on the field it can't write that row to the table. I'll look at it some more in a few.

    ***It looks like the application is using an API to write to the database and that data is already present in the table. I'm guessing a Java application is writing to this database. When I'm troubleshooting errors like this I usually find it to be database related. They may have to go in, remove the offending record, and then try the update again. I would try to reproduce this error in a test/stage/dev environment, blow the record away, and see if it comes back. Also, are they getting this for all records? HTH and doesn't confuse. I wonder if that UUID is already in the table - that would be a unique identifier.
  • NotHackingYou Member Posts: 1,460 ■■■■■■■■□□
    The problem is you have a constraint saying X values in this table must be unique. You are trying to insert a duplicate record.

    You should be able to see your already existing record with this script:
    SELECT * FROM dbo.flashcluster WHERE
    objname = 'Bourke Street'
    AND objUuid LIKE 'F06706c5-%'

    You might be able to fix it by deleting that record, but that of course could have a lot of other implications. If you right-click on that key, choose 'Script Key As', and send me the script via PM and/or send more of the exception, I may be able to offer more help.
    When you go the extra mile, there's no traffic.
  • KenC Member Posts: 131
    If I am following this correctly, you have to look at the code/application that is doing this, not change the database to suit it.
  • NotHackingYou Member Posts: 1,460 ■■■■■■■■□□
    If we can see which columns the unique constraint is on, we may be able to give him some more clues. Perhaps it is on three columns and the third has a DEFAULT value. Perhaps the first two columns we can see are fine to repeat as long as the third column is unique.

    I agree that it's probably something that has to be fixed in the app. However, in the above scenario it's possible that he could issue an UPDATE statement to correct a value and thus allow his new (and now unique) value to be inserted. It may have other implications so of course testing would be required but it would not be off the table.
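
    For illustration, here's roughly what that kind of UPDATE could look like. The table and column names come from the error in the screenshot, but the replacement value is purely made up and would need to match whatever FVP actually expects:

    --Hypothetical only: rename the existing row so the new, unique row can be inserted
    UPDATE dbo.flashcluster
    SET objname = 'Bourke Street (old)'
    WHERE objname = 'Bourke Street'
    AND objUuid LIKE 'F06706c5-%'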
    When you go the extra mile, there's no traffic.
  • Essendon Member Posts: 4,546 ■■■■■■■■■■
    Thanks for the responses guys, mucho appreciated!

    @blargoe - I fired up my lab, that's all I did. I created this PernixData Flash Cluster for a demo I needed to do a few days ago. Everything was good then, and I shut the lab down. Today I thought I'd do some more work on the demo and have it ready for the execs to look at. I started up all my VMs, went to the PernixData tab in the vSphere Client, and I don't see the Flash Cluster I created a few days ago. I restarted the lab, and the Flash Cluster still wouldn't show up. The weird thing is, if I try to create another cluster with the same name it throws an error saying the cluster already exists - but it doesn't load up in the vSphere Client. So I looked at the logs of the Flash Cluster Mgmt server and saw the few lines I posted in the first screenshot.

    @N2 - Please read the above for some more info. Yes, you're right - the app, Flash Virtualization Platform or FVP, is trying to do something. It appears to be trying to overwrite some entry in the DB.

    Guys, please note I'm not doing anything - I just started up all my VMs.

    @Carl - I don't know why the application, FVP, is trying to insert a duplicate record. Like I said in the opening post, the domain account I've blurred out isn't the one that owns the database. Does it look like FVP is trying to use a different account than the one I installed it with? I'll run that script this evening. I don't know what may happen if I delete that record - I don't really want to create the whole thing all over again. I'll send the script via PM this evening. What other testing can I do?

    Thanks heaps guys!
    NSX, NSX, more NSX..

    Blog >> http://virtual10.com
  • lsud00d Member Posts: 1,571
    I agree with N2IT, that UUID probably already exists--primary keys must be unique.

    Also, what CarlSaiyed suggests with the SELECT statement will help you identify whether that UUID indeed already exists.
  • NotHackingYou Member Posts: 1,460 ■■■■■■■■□□
    Based on what you posted, I don't see a permission issue; I see a data issue. Consider this example:

    Here's my data

    Name Number
    Bob 1
    Chuck 2
    Phil 3

    When I enforce a unique constraint on name and number, a row 'Bob', 1 may not be inserted because it will not be unique. However, I CAN insert 'Bob', 4 and likewise I can insert 'Chuck', 1. Your application is telling you that you are trying to insert 'Bob', 1 but it has been told not to allow duplicates.
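
    To make that concrete, here's a tiny self-contained demo you can run in a scratch database - the table and constraint names are made up for the example:

    --Demo only: invented table and constraint names
    CREATE TABLE dbo.demo_people (
        Name varchar(50) NOT NULL,
        Number int NOT NULL,
        CONSTRAINT UQ_demo_people UNIQUE (Name, Number)
    )

    INSERT INTO dbo.demo_people VALUES ('Bob', 1), ('Chuck', 2), ('Phil', 3)

    INSERT INTO dbo.demo_people VALUES ('Bob', 4)   --succeeds, the (Name, Number) pair is new
    INSERT INTO dbo.demo_people VALUES ('Chuck', 1) --succeeds
    INSERT INTO dbo.demo_people VALUES ('Bob', 1)   --fails with a unique key violation, like the one in the log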

    There are several ways you can get around this.
    1. Provided your application must inject 'Bob', 1 to continue and there's no dependency on the existing 'Bob', 1 record, you could simply remove this record. I don't recommend this because you have no idea what else is tied to that record, and based on other constraints that may exist you may not even be able to delete it.

    2. You can disable the unique constraint to allow two entries for 'Bob', 1. I don't recommend this because the unique constraint is probably there for a reason.

    For 1 and 2, you can take some easy precautions here like a database backup or even just a table backup (SELECT * INTO flashcluster_backup FROM flashcluster will create a copy of your table). You could then truncate flashcluster and put your data back if you wanted to, preserving the keys by enabling identity insert.

    3. You can figure out why the application is trying to insert 'Bob', 1 when it already exists. Perhaps the user account cannot read the existing table for lack of permissions and so assumes it needs to inject the record. Perhaps another reference is missing somewhere else in your database. Perhaps there's a configuration that prompts this injection. This is, of course, the recommended approach (see the quick checks just after this list).

    The answers to the other questions (scripts) can help point you in the right direction for #3. I can provide you with guidance on options 1 and 2 if you wish, but without access to the DBMS it's impossible to tell from here what the ramifications could be.
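
    For #3, a couple of quick checks you could run while connected as the account FVP uses. These only use built-in SQL Server functions; the one assumption is that you're connected to the FVP database when you run them:

    --What can the current account actually do against the table?
    SELECT * FROM fn_my_permissions('dbo.flashcluster', 'OBJECT')

    --Who owns the current database? Compare this with the account in the log
    SELECT DB_NAME() AS database_name, SUSER_SNAME(owner_sid) AS database_owner
    FROM sys.databases
    WHERE name = DB_NAME()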
    When you go the extra mile, there's no traffic.
  • NotHackingYou Member Posts: 1,460 ■■■■■■■■□□
    Here's some sample SQL that you could use to try idea #1.

    --Make a backup of this table
    SELECT * INTO dbo.flashcluster_backup FROM dbo.flashcluster

    BEGIN TRAN
    DELETE dbo.flashcluster WHERE objname = 'Bourke Street' AND objUuid = '<paste in ID, keep quotes>'
    ROLLBACK TRAN
    --This should show one row affected. If it does, then run it again below to save your work

    BEGIN TRAN
    DELETE dbo.flashcluster WHERE objname = 'Bourke Street' AND objUuid = '<paste in ID, keep quotes>'
    COMMIT TRAN

    --Now run your application and see if things work. If not, proceed below

    --Delete everything in flashcluster and put it back the way it was
    TRUNCATE TABLE dbo.flashcluster
    GO
    --SET IDENTITY_INSERT is only needed (and only valid) if the table has an identity column
    SET IDENTITY_INSERT dbo.flashcluster ON
    --List every column here in the same order as the backup table, including any identity column, so SELECT * maps correctly
    INSERT INTO dbo.flashcluster (objname, objUuid, <list remaining columns, separated by a comma>)
    SELECT * FROM dbo.flashcluster_backup
    SET IDENTITY_INSERT dbo.flashcluster OFF


    This is a very simple example that does not take into account any other relationships and how a delete could cascade there. Tread very carefully if you decide to go this route.
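
    If you do go down this route, one quick check up front is whether anything else references the table, since that's where a delete would cause trouble. This only reads the system catalog, so it's safe to run:

    --List any foreign keys that point at dbo.flashcluster before deleting from it
    SELECT fk.name AS foreign_key_name, OBJECT_NAME(fk.parent_object_id) AS referencing_table
    FROM sys.foreign_keys AS fk
    WHERE fk.referenced_object_id = OBJECT_ID('dbo.flashcluster')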
    When you go the extra mile, there's no traffic.
  • Essendon Member Posts: 4,546 ■■■■■■■■■■
    I'll give this a go tonight Carl. Thank you for taking the time to write this out, much appreciated.
    NSX, NSX, more NSX..

    Blog >> http://virtual10.com
  • NotHackingYou Member Posts: 1,460 ■■■■■■■■□□
    Honestly, I would probably take a full database backup and restore it (to ensure it is good) before trying the script in post 11, unless you are comfortable rebuilding the entire system.

    The other script in post 5 won't hurt anything, and neither will scripting the constraint (this just shows you the SQL it would take to make it again).
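
    A minimal sketch of what that backup could look like - the database name and file path here are placeholders, not the real ones:

    --Take a full backup, then verify the backup file is readable
    BACKUP DATABASE [YourFvpDatabase] TO DISK = 'C:\Backups\YourFvpDatabase.bak' WITH INIT
    RESTORE VERIFYONLY FROM DISK = 'C:\Backups\YourFvpDatabase.bak'
    --For a stronger check, do a test restore to a throwaway copy (RESTORE FILELISTONLY shows the logical file names you'd need)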
    When you go the extra mile, there's no traffic.
  • N2IT Inactive Imported Users Posts: 7,483 ■■■■■■■■■■
    Inquiring minds want to know.
  • Essendon Member Posts: 4,546 ■■■■■■■■■■
    I'll absolutely get back to you guys! Swamped with tons of work, had no energy to fire up the lab. I'll definitely have a go at it this weekend and keep you good sirs posted!
    NSX, NSX, more NSX..

    Blog >> http://virtual10.com
  • Essendon Member Posts: 4,546 ■■■■■■■■■■
    Update - I took a backup of the database and proceeded with a shotgun approach to this: I deleted the unique constraint. What this has done is allow me to create a cluster with the same name again, but the previously created cluster still doesn't show up in vCenter. Something weird is going on here - this time the unique constraint wasn't re-created (or is it only created when a new DB is created?). I've also logged a ticket with the vendor to check out their application. Like you guys said, there's likely something within the application that's causing this behaviour. Please note this is a beta version, so an error here and there is likely. Thanks for the replies guys, you've improved my SQL skills.

    I'll post back when the vendor comes back to me.
    NSX, NSX, more NSX..

    Blog >> http://virtual10.com
  • NotHackingYou Member Posts: 1,460 ■■■■■■■■□□
    Thanks for the update. The unique constraint is typically created just after the table is created. You could try to re-create it if you wanted, but the creation would fail if all of the rows are not unique (as defined by the constraint).
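
    If you do try that, a rough sketch of what it might look like - the constraint name and column list here are guesses based on the error message, so match them to the scripted original if you still have it:

    --Assumed constraint name and columns; adjust to the real definition
    ALTER TABLE dbo.flashcluster
    ADD CONSTRAINT UQ_flashcluster_name_uuid UNIQUE (objname, objUuid)
    --This will fail if duplicate (objname, objUuid) pairs are already in the table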
    When you go the extra mile, there's no traffic.
  • Essendon Member Posts: 4,546 ■■■■■■■■■■
    Another update - it was a Java error in the end, as you guys suggested. It needed a particular version of the JRE. Thanks again!
    NSX, NSX, more NSX..

    Blog >> http://virtual10.com