
Chris Wilson
@SQLChrisW
I am a SQL administrator, developer, and BI analyst who has been working with databases since 1999. I live, work, and play in the great Pacific Northwest.
Odd behavior when programmatically creating SSIS packages: creating 1,300 packages, we succeed on ~300 and then it stops without error. Nothing in any log I can find on the source or destination server where the packages are created. Any other logging I can turn on, or anywhere else to look? #sqlhelp
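A hedged starting point for extra logging, assuming the packages are being deployed to an SSIS catalog (SSISDB) rather than msdb or the file system: the catalog records each deployment operation and any messages attached to it.

USE SSISDB;
-- Most recent catalog operations (deployments included) and their messages, newest first.
SELECT TOP (50)
       o.operation_id,
       o.operation_type,
       o.status,
       o.created_time,
       m.message_time,
       m.message
FROM   catalog.operations AS o
LEFT JOIN catalog.operation_messages AS m
       ON m.operation_id = o.operation_id
ORDER BY o.created_time DESC, m.message_time DESC;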
#sqlhelp SQL 2019 Standard edition has a 24-core limit. For a physical server with 16 cores hyper-threaded to 32 logical cores, will SQL Server use 24 or 32 cores? I can't test this myself today and keep finding mixed answers on sites I trust in Google searches.
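One way to verify on the box itself rather than from search results: count the schedulers SQL Server actually brought online, and read the licensing line it writes to the error log at startup.

-- Schedulers marked VISIBLE ONLINE are the ones SQL Server schedules work on.
SELECT COUNT(*) AS schedulers_in_use
FROM   sys.dm_os_schedulers
WHERE  status = 'VISIBLE ONLINE';

-- The startup message noting how many logical processors are used "based on
-- SQL Server licensing" is in the current error log.
EXEC sys.xp_readerrorlog 0, 1, N'licensing';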
#sqlhelp I'll try not to show bias here, but "Is it a valid test to QA a database at a lower compat level and say it is certified for that version?" I.e., test code against 140, 130, 120, and 110 compat levels on a 2017 instance and say it's good to run against a 2012 instance?
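For reference, the matrix test described above is just cycling the QA database through the older compatibility levels on the 2017 instance (database name is a placeholder). The usual caveat: compatibility level mainly changes optimizer and language behavior, so it does not make the 2017 engine behave exactly like a 2012 one.

-- Placeholder database name; run the regression suite after each change.
ALTER DATABASE [QADatabase] SET COMPATIBILITY_LEVEL = 110;  -- SQL Server 2012 behavior
ALTER DATABASE [QADatabase] SET COMPATIBILITY_LEVEL = 120;  -- SQL Server 2014
ALTER DATABASE [QADatabase] SET COMPATIBILITY_LEVEL = 130;  -- SQL Server 2016
ALTER DATABASE [QADatabase] SET COMPATIBILITY_LEVEL = 140;  -- SQL Server 2017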
#sqlhelp Massive CREATE VIEW stmt that unions 2000 tables takes 16 mins to complete while one with 200 tables takes 5 secs. SOS_SCHEDULER_YIELD is the only wait. CPU time is almost all of the resource used. Any idea what's causing the massive drop in performance between them?
#sqlhelp Can I mimic the effects of AutoClose so I can deallocate buffers for a single database on test servers used by multiple testers?
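A workaround sometimes used in place of AUTO_CLOSE on a shared test server (database name is a placeholder): cycling the one database offline and back online evicts its pages from the buffer pool without touching the other testers' databases, whereas DBCC DROPCLEANBUFFERS would clear the whole instance.

-- Kicks out everyone connected to this database; its buffer pool pages are
-- deallocated while it is offline.
ALTER DATABASE [TestDb] SET OFFLINE WITH ROLLBACK IMMEDIATE;
ALTER DATABASE [TestDb] SET ONLINE;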
#sqlhelp - Anyone know the calculation for the number of possible execution plans per table added to a query?
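The usual back-of-the-envelope counts from the query-optimization literature, not what SQL Server actually costs (the optimizer prunes the search space heavily), so treat these as upper bounds on join orders for n tables:

Left-deep join trees:  N(n) = n!
Bushy join trees:      N(n) = (2n-2)! / (n-1)!
Example: n = 5  ->  120 left-deep orderings, 1,680 bushy
         n = 10 ->  3,628,800 left-deep, ~1.76e10 bushy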
Are EXEC statements run via a cursor a single transaction? I don't think so, but I'm seeing the t-log fill up where it shouldn't. #sqlhelp
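A minimal sketch of the behavior in question (object names are placeholders): each EXEC in the loop commits on its own under autocommit unless an outer transaction wraps the whole loop. Separate transactions alone won't keep the log small, though, if the log can't truncate (FULL recovery without frequent log backups, or one long-running statement inside the loop).

DECLARE @sql nvarchar(max);
DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT cmd FROM dbo.WorkQueue;          -- placeholder source of statements
OPEN c;
FETCH NEXT FROM c INTO @sql;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC (@sql);            -- autocommit: commits per statement unless an outer
                            -- BEGIN TRAN is open around the loop
    SELECT @@TRANCOUNT;     -- 0 here confirms nothing is holding a transaction open
    FETCH NEXT FROM c INTO @sql;
END
CLOSE c;
DEALLOCATE c;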
WideOrbit is looking for a solid DBA with good communication skills in Lynnwood WA or San Fran CA bit.ly/1bKYqLT #sqljobs
Blog - Brent Ozar by Brent Ozar Unlimited® brentozar.com/archive/2015/0… via @BrentOzarULTD
@AdamMachanic @SQLSoldier The problem is back today. 70 invalids first run I checked, 600 now. I am querying internal tables now.
@AdamMachanic @SQLSoldier Don't have the book unfortunately, but I was able to get the DAC set up and checked; no invalid numbers today either
@AdamMachanic @SQLSoldier Going to try the DAC now if there are invalid records this morning.
@AdamMachanic @SQLSoldier other columns in the dmv and found they were corresponding delete transactions 2/2
@AdamMachanic @SQLSoldier I believe so. I'm fairly new to change tracking though. I queried dm_tran_commit_table on principal using the 1/2
@AdamMachanic @SQLSoldier and has NULL for all related fields from base table. The I/U record returned is between last vers. and max vers.
@AdamMachanic @SQLSoldier The record is deleted from the base table but the last change in CHANGETABLE(CHANGES ...) shows an I/U record 1/2
@AdamMachanic @SQLSoldier Creates invalid change tracking rows in CHANGETABLE function. Tracked missing transactions back to invalid nums
@AdamMachanic @SQLSoldier 200 or so per three-hour snapshot are invalid/low numbers. #sqlhelp
#sqlhelp dm_tran_commit_table on the snapshot has invalid, really low commit_ts nums; the principal doesn't have them. Causes probs w/ change tracking
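Rough sketch of the cross-check described in the replies above (table and key names are placeholders): find rows that CHANGETABLE reports as I/U but that no longer join back to the base table, alongside the suspect commit_ts values.

DECLARE @last_sync bigint =
    CHANGE_TRACKING_MIN_VALID_VERSION(OBJECT_ID('dbo.SomeTable'));

SELECT ct.SYS_CHANGE_VERSION,
       ct.SYS_CHANGE_OPERATION,          -- I/U rows joining to nothing are the suspects
       ct.PK_Col,
       t.PK_Col AS base_table_pk         -- NULL means the row is gone from the base table
FROM   CHANGETABLE(CHANGES dbo.SomeTable, @last_sync) AS ct
LEFT JOIN dbo.SomeTable AS t
       ON t.PK_Col = ct.PK_Col
ORDER BY ct.SYS_CHANGE_VERSION;

-- Commit table on the snapshot, lowest commit_ts first, to spot the invalid/low numbers.
SELECT commit_ts, xdes_id, commit_time
FROM   sys.dm_tran_commit_table
ORDER BY commit_ts;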