Friday, March 30, 2012
Having trouble converting to datetime in SQL import
Now that we've moved the tables to SQL Server 2000, we're having problems with the data. We wanted to set up DTS packages for each file to just put them directly into SQL Server. This works for pretty much everything except for the dates.
The way the files are set up, they're comma delimited files with quotes around the text and nothing around the dates. The dates don't have any delimiters; they're just listed like 13012006. Every time we try to import these files into SQL, it gripes about the datatypes; we're trying to put the dates into datetime fields but SQL thinks they're strings. Eek! If we put date delimiters (like 13/01/2006) SQL pulls them in fine, but apparently the mainframe lacks the ability to put these delimiters in by itself and still run everything else OK. The person who writes the extracts has to do it in a language called 'Focus' which I've never heard of and don't know anything about, and he says what I'm asking for can't be done. OK...so now what?
I've tried and tried to convert these strings into dates using both CAST() and CONVERT() and just can't manage to do it. I know I'm missing something really obvious here; does anyone have any tips or advice? Thanks in advance.
It's a bit of a pain to find in the DTS Designer, but within the Data Pump task, if you individually define the column mappings, there is an option to select DateTime strings.
One way to do this is to:
1. Select your source and destination connections
2. Add the Data Pump task
3. Specify your source table/query
4. Specify the destination query
5. Click on the 'Transformations' tab (usually the column mappings will auto-populate)
6. Click on the column mapping where the source is the date formatted as a string.
7. Delete this column mapping (click on delete or hit the delete key)
8. Select the source column and the destination column and click on 'New'
9. Choose the third item from the pop-up list (DateTime string)
10. You will then have to define the properties of the source and destination formats.
I've worked with this before and it worked pretty well.
Alternatively, you could proceed through item 8 and then (in item 9) choose ActiveX script. This will allow you to define more custom formats and insert additional error checking and/or handling.
Regards,
hmscott
Thanks for the advice; I'll try your first suggestion, since I can't write ActiveX script. I can usually manage SQL and I can almost always figure it out on my own, but this one stumped me. I'm not a certified DBA or anything; I'm still learning. :)
We tried using DateTime strings and it just refused to work; we had to write ActiveX scripts in the end using DateSerial in VBScript. That was tough for us because neither one of us really knows VBScript; I'm a VB noob, and my colleague knows VBA but not VBScript (I know they're basically the same, but he doesn't :)). But we did eventually get it to work! So yay!
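For reference, once the raw value is available inside SQL Server (for example, bulk-loaded into a varchar staging column), the undelimited DDMMYYYY string can also be converted in T-SQL. A minimal sketch with hypothetical names; it rearranges the string into the unambiguous YYYYMMDD form before converting:
-- Hypothetical example: @raw stands in for a staging column holding the mainframe date.
DECLARE @raw char(8)
SET @raw = '13012006'
SELECT CONVERT(datetime,
       SUBSTRING(@raw, 5, 4) + SUBSTRING(@raw, 3, 2) + SUBSTRING(@raw, 1, 2))
-- The same expression works in an UPDATE or INSERT...SELECT against a staging table.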
Having problems with SQLXML bulk import using diffgram
I have a program that writes out xml files that I use to import into SQL
server. When I start importing, most of the files that I have import
successfully, but some of them error out with the message: "SQLOLEDB Error
Description: Empty update, no updatable rows found Transaction aborted".
I've looked at the xml diffgram and there are definitely modified and
inserted rows in it. Has anyone run across this? I'd be glad to post the
diffgram here if anyone would like to see it.
Thanks,
Rob Baber
Do you use the diffgram or updategram format?
Sending a repro would help.
Thanks
Michael
"Rob Baber" <Rob Baber@.discussions.microsoft.com> wrote in message
news:E10640A7-BCE5-42AE-B842-C9956E067812@.microsoft.com...
>I have a program that writes out xml files that I use to import into SQL
> server. When I start importing, most of the files that I have
> successfully
> import, but some of them error out with the message: "SQLOLEDB Error
> Description: Empty update, no updatable rows found Transaction aborted".
> I've looked at the xml diffgram and there are definitely modified and
> inserted rows in it. Has anyone run across this? I'd be glad to post the
> diffgram here if anyone would like to see it.
> Thanks,
> Rob Baber
Could you please check the T-SQL generated by SqlXml, using the Profiler? You are most likely using a diffgram that behaves differently than you expect, or you are hitting a bug in SqlXml.
If you check the SQL and it looks wrongly generated, could you please post your diffgram and T-SQL so that we can investigate whether it is a bug.
Thanks
Bertan ARI
This posting is provided "AS IS" with no warranties, and confers no rights.
"Rob Baber" <Rob Baber@.discussions.microsoft.com> wrote in message
news:E10640A7-BCE5-42AE-B842-C9956E067812@.microsoft.com...
>I have a program that writes out xml files that I use to import into SQL
> server. When I start importing, most of the files that I have
> successfully
> import, but some of them error out with the message: "SQLOLEDB Error
> Description: Empty update, no updatable rows found Transaction aborted".
> I've looked at the xml diffgram and there are definitely modified and
> inserted rows in it. Has anyone run across this? I'd be glad to post the
> diffgram here if anyone would like to see it.
> Thanks,
> Rob Baber
Monday, March 26, 2012
Haven't been able to execute a package on the server
Hello, I created a package on my machine. It deletes some files, then deletes some rows, then copies files from a source to a destination, and then processes all those files and inserts rows in a table; really simple.
When I click execute in VS 2005 it executes perfectly. (it took about 45 seconds because there are many files)
I connected to Integration Services and imported the package, then I did the following two things.
1. Right-click, run package: it executed normally but took less than 1 second, and when I looked at the destination folder there were no files in there, so it didn't do anything.
2. I created a job and in the first step I put that package to execute. I then executed the job and the same thing happened: it executed without errors but took less than 1 second, and when I looked at the destination folder there were no files in there, so it didn't do anything.
I noticed that the Integration Services project has a property for creating a deployment utility. I changed this property to true but I don't know how to make the deployment utility.
Maybe the problem was that when I imported the package, the package was on another machine on my LAN?
Can the package on the server even see the files? The directories have to be the same, and the user account for the SQL Server service must have rights to that directory as well.
When you execute it on your machine, it's using your user account and currently accessible folders. When the package is promoted to the server, you will be using a different account.
Enable package logging to get more details of the execution. The package may be failing...
Hello, I was finally able to upload the package, and from the Management Studio interface I ran the package and it worked perfectly.
When I created a job with only one step, to execute that package, the job failed.
When I go to the history it doesn't give me any details of what failed in the package or in the job:
Date 24/01/2007 12:30:28
Log Job History (Carga datos ACH)
Step ID 1
Server ATLANTE\SQL2005
Job Name Carga datos ACH
Step Name Carga de datos de ach
Duration 00:00:02
Sql Severity 0
Sql Message ID 0
Operator Emailed
Operator Net sent
Operator Paged
Retries Attempted 0
Message
Executed as user: ATLANTE\SYSTEM. The package execution failed. The step failed.
Maybe it is the user that it tried to execute the package as?
|||
Try using the SQL Server Agent CmdExec job step subsystem so that Dtexec can be employed. This will cause more detailed error messages to be stored in the job history. It is also useful to implement the job step log.
Luis Esteban Valencia Muñoz wrote:
Hello, I finally could upload the package, and from the management studio interface I ran the package and it worked perfectly.
When I created a job, with one step only to execute that package, the job fails.
When I go to history it doesnt give me any details of what failed on the package or in the job
Hi Luis,
This could be due to the ProtectionLevel setting for the individual packages - that's my guess.
By default, these are set to EncryptSensitiveWithUserKey. This means as long as you personally execute the packages, your credentials are picked up and the packages execute in your security context. This is true even if you're connected to a remote machine, so long as you're using the same AD credentials you used when you built the packages. Does this make sense?
When the job you created executes, it runs under the SQL Agent Service logon credentials.
My understanding of the "Sensitive" in EncryptSensitiveWithUserKey may be flawed, but I cannot find a way to tell my SSIS package "hey, this isn't sensitive so don't encrypt it." Although this sometimes gets in the way I like this feature because it keeps me from doing something I would likely later regret. Anyway, my point is the Sensitive label is applied to connection strings and I cannot find a way to un-apply it (and I'm cool with that).
One of the first things an SSIS package tries to do (after validation) is load up configuration file data. This uses a connection, which (you guessed it) uses your first connection string. Since yours are encrypted with your own personal SID on the domain and this is different from the SID on the account running the SQL Agent Service, the job-executed packages cannot connect to the configuration files to decrypt them.
There are a couple proper long-term solutions but the one that is simplest to implement is to use the EncryptSensitiveWithPassword Package ProtectionLevel option and supply a good strong password. You will need to supply the password when you set up the job step as well. But this should allow the package to run without needing your security credentials.
Note: You will also need this password to open the packages in BIDS (or Visual Studio) from now on... there's no free lunch.
Hope this helps,
Andy
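For anyone reading along, the two suggestions above can be combined. A hypothetical CmdExec job step (the package path and password are made up) that runs the package through dtexec and supplies the decryption password for a package saved with EncryptSensitiveWithPassword:
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Carga datos ACH',
    @step_name = N'Run package via dtexec',
    @subsystem = N'CmdExec',
    @command   = N'dtexec /FILE "C:\Packages\CargaDatosACH.dtsx" /DECRYPT "StrongPassword" /REPORTING EW'
The /REPORTING EW switch asks dtexec to emit errors and warnings, which is where the extra diagnostic detail mentioned above comes from.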
Friday, March 23, 2012
have a problem
who knows how to fix this
> MMC cannot open the file C:\Program Files\Microsoft SQL
Server\80\Tools\BINN\Sql Server Enterprise Manager.MSC.
This may be because the file doesnt exist, is not an MMC console, or
was created by a later version of MMC. This may also be because
Thx
Vegita
---
Posted via http://www.mcse.ms
---
View this thread: http://www.mcse.ms/message412335.html
Hi Vegita.
The error indicates one of a few things is the problem - have you checked
each of them? Can you offer us anything more to work with, such as:
(a) Does the file actually exist?
(b) Have you upgraded MMC.exe? (added new service packs lately etc)
(c) Have you uninstalled the SQL client tools perhaps?
A simple catch-all approach to solving the problem might be to simply
re-install the SQL Server Client Tools (not the server) which will probably
get you out of jail..
Regards,
Greg Linwood
SQL Server MVP
"Vegita" <Vegita.11xyqq@.mail.mcse.ms> wrote in message
news:Vegita.11xyqq@.mail.mcse.ms...
> Who knows how to fix this..
> who knows how to fix this
> > MMC cannot open the file C:\Program Files\Microsoft SQL
> Server\80\Tools\BINN\Sql Server Enterprise Manager.MSC.
> This may be because the file doesnt exist, is not an MMC console, or
> was created by a later version of MMC. This may also be because
> Thx
>
> Vegita
> ---
> Posted via http://www.mcse.ms
> ---
> View this thread: http://www.mcse.ms/message412335.html
>
Have a copy of a database
Is there any way of making a copy of a database (copying the MDF and LDF files)
without detaching the database?
Thanks
Yuelin
Yes. Use the BACKUP DATABASE command. This will, after you have performed a RESTORE, achieve the same
effect as copying the physical files.
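A minimal sketch of that approach (database name, logical file names, and paths are hypothetical):
BACKUP DATABASE MyDb TO DISK = N'D:\Backup\MyDb.bak'
RESTORE DATABASE MyDb_Copy
    FROM DISK = N'D:\Backup\MyDb.bak'
    WITH MOVE 'MyDb_Data' TO N'D:\Data\MyDb_Copy.mdf',
         MOVE 'MyDb_Log' TO N'D:\Data\MyDb_Copy_log.ldf'
-- RESTORE FILELISTONLY FROM DISK = N'D:\Backup\MyDb.bak' lists the logical file names to use with MOVE.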
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
"Yuelin Liang" <yuelinliang@.hotmail.com> wrote in message
news:O1MSAHZHFHA.3332@.TK2MSFTNGP15.phx.gbl...
> Hi
> Is any way of making a copy of a database (copying the MDF and LDF files) without detaching the
> database?
> Thanks
> Yuelin
>
You may use DTS to transfer the data across: from EM, right-click the database, then Import or Export.
my 2c
Yifei
"Yuelin Liang" <yuelinliang@.hotmail.com> wrote in message
news:O1MSAHZHFHA.3332@.TK2MSFTNGP15.phx.gbl...
> Hi
> Is any way of making a copy of a database (copying the MDF and LDF files)
> without detaching the database?
> Thanks
> Yuelin
>
Wednesday, March 21, 2012
Hash files in SQL Server
I am trying to test a piece of software called NGS SQL Crack, which basically
cracks the passwords for all the userids on your server. For the test
it is asking me to load a "hash file". Any ideas what they are
referring to?
Thanks,
Raziq.
*** Sent via Developersdex http://www.examnotes.net ***
Don't just participate in USENET...get rewarded for it!
It's not a SQL Server thing. I would read the docs for the software product you're using. I suspect it explains what the hash file is...
Also... I found some info simply by googling the relevant terms...
--
Brian Moran
Principal Mentor
Solid Quality Learning
SQL Server MVP
http://www.solidqualitylearning.com
"Raziq Shekha" <raziq_shekha@.anadarko.com> wrote in message news:eGjCpkw%23D
HA.3536@.TK2MSFTNGP10.phx.gbl...
Hello everyone,
I am trying to test a software called NGS SQL Crack which basically
cracks the passwords for all the userids on your server. For the test
it is asking me to load a "hash file". Have any ideas what they are
referring to?
Thanks,
Raziq.
*** Sent via Developersdex http://www.examnotes.net ***
Don't just participate in USENET...get rewarded for it!
Monday, March 19, 2012
Has anyone noticed a size difference between 2.0 and 3.0?
Now, under VS 2005 and SQL Mobile (AKA 3.0), the same application, compiled under the .NET Compact Framework 2.0, using the same CSV files generates an SDF of over 13 Megabytes.
Obviously, this was a bit unexpected and is having a rather negative impact on the HP iPaq rx1950 that the database was supposed to be running on.
The database is encrypted during the create call on both .NET CF's and SQL CE versions. I was finally able to run compact via the SQL CE tool under emulation connecting to the SDF on the desktop (running compact on the device or under emulation resulted in both running out of memory); however, it caused no change whatsoever in file size.
Has anyone seen something similar? Am I missing a new call or, since my code has not changed between versions, could I possibly be making a deprecated call?
Thanks,
Brian
Brian,
I ran this same test - generating a SQL CE/SQL mobile database using a simple CSV loader I wrote and I noticed that if the database contains an NTEXT field, the SQL Mobile database ended up much larger on device (by a factor of 2-3x) than the corresponding SQL CE database.
I sent the sample code to recreate this to the SQL Mobile team and will followup with them on this issue and post the results here.
If I removed the NTEXT fields from my tables (used nvarchar instead), the resulting SQL Mobile db was comparable in size.
-Darren
|||
Darren,
Thanks! That did it. I converted the one ntext column to an nvarchar and we're back to our original size. Since I had more than 4000 rows, I am guessing that SQL CE 3.0 was reserving a ton of additional space, just in case.
Regards,
Brian
Hi,
SQL Mobile 3.0 reserves a data page for long value data when the data length is more than 256 bytes. NTEXT and IMAGE are long value data types. So, if you have a table with an NTEXT/IMAGE column, SQL Mobile 3.0 creates a data page for each row where the data size is more than 256 bytes, and the data page size is typically 4K. If you want to update the NTEXT/IMAGE column, the operation will be very fast; that is by design. Also, whenever the data length exceeds 4K (not really 4K, but 4K minus some control data), another data page is allocated. Even if your data value is just 4.1K, 8K is what SQL Mobile 3.0 reserves. Best practice here is to align your data sizes on a 4K boundary.
Note: The page size may not always be 4K. It varies!
Thanks,
Laxmi Narsimha Rao ORUGANTI, MSFT, SQL Mobile, Microsoft Corporation
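For illustration, a minimal sketch of the schema change being discussed (the table and column names are hypothetical). Declaring the column as nvarchar avoids the per-row long-value page described above, as long as the values fit the declared length:
-- Hypothetical SQL Mobile table definition.
CREATE TABLE Readings (
    ReadingID int NOT NULL PRIMARY KEY,
    Notes nvarchar(4000) NULL    -- was ntext in the original schema
)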
Wednesday, March 7, 2012
Hard time master log files
I have a small DB, about 200MB, but my transaction log file regularly grows
to over 900MB.
The only way I have been able to manage the log file is to detach the
database, delete the log file and reattach the DB.
Is there any better way to manage growing log files?
Thanks in advance
Christian
Hi,
Check the recovery model for your database using the command below:
SELECT DATABASEPROPERTYEX('<dbname>', 'Recovery')
If the recovery model is Simple then the LDF shouldn't grow big unless you run some
large batch operation. In that case, once the transaction is over, you can
shrink the LDF file using the DBCC SHRINKFILE command.
If the recovery model is either FULL or BULK_LOGGED then you need to take
transaction log backups using the BACKUP LOG command. This
will keep the LDF file from growing.
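A minimal sketch putting that together (the database and logical log file names are hypothetical):
-- Check the recovery model (SQL Server 2000).
SELECT DATABASEPROPERTYEX(N'MyDb', 'Recovery')
-- Simple recovery: shrink the log file after a large batch operation.
USE MyDb
DBCC SHRINKFILE (MyDb_Log, 100)    -- target size in MB
-- Full or bulk-logged recovery: back up the log regularly so the space can be reused.
BACKUP LOG MyDb TO DISK = N'D:\Backup\MyDb_log.bak'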
Thanks
Hari
SQL Server MVP
"Christian Perthen" <abracadabara@.dontreplytothi
message news:e2YgKk8QFHA.3928@.TK2MSFTNGP09.phx.gbl...
> Hi,
> I have a small DB, about 200MB, but my transaction log files grows
> regulary
> over 900MB.
> The only way I have been able to manage the log file is to detach the
> database, delete the log file and reattach the DB.
> Is there any better way to manage growing log files?
> Thanks in advance
> Christian
>
Hard Drive Crashed. master & user db mdf & ldf files lost
One of our hard drives has crashed and as a result we have lost our master mdf/ldf & user db mdf/ldf files. It's not that critical a system by any means, but if the hard drive crashes and the master mdf/ldf files & user db mdf/ldf are lost, is there any way of restoring the system?
I'm thinking we probably need to re-install Enterprise Manager completely, and re-install the user db from a backup.
Any advice/suggestions would be much appreciated.
Thanks in advance,
If the disks are toast, then you have to restore from your backups. When you have the hardware and O/S back to where they need to be, install SQL Server with any required service packs/patches, then restore the master database (see BOL for instructions), then restore the user databases.
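A rough sketch of the recovery sequence, assuming valid backups exist (paths and names are hypothetical):
-- master can only be restored with the server started in single-user mode
-- (for example, run sqlservr.exe -m from the Binn directory).
RESTORE DATABASE master FROM DISK = N'D:\Backup\master.bak'
-- The server shuts down after master is restored; restart it normally, then restore each user database.
RESTORE DATABASE MyDb FROM DISK = N'D:\Backup\MyDb.bak' WITH RECOVERY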
Hard drive crashed but MDF/LDF on drive D:
SQL 7.0/Server 2000
The c: drive crashed;
the MDF and LDF files were on drive D:
When a new hard drive is installed with the new OS and
SQL 7.0 installed on it...is there a way to "attach" to
the MDF and LDF files? or are they lost?
Thanks,
Don
with SQL7 and/or SQL 2000
you can try sp_attach_db
PS. usually better to perform sp_detach_db but you had no choice in this
instance :-)
hope this works.
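For example, a minimal sketch (file names are hypothetical) of attaching the surviving files on D: once SQL Server is reinstalled:
EXEC sp_attach_db @dbname = N'MyDb',
    @filename1 = N'D:\MSSQL\Data\MyDb_Data.mdf',
    @filename2 = N'D:\MSSQL\Data\MyDb_Log.ldf'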
"Don" <anonymous@.discussions.microsoft.com> wrote in message
news:086001c4f832$20813b50$a601280a@.phx.gbl...
> SQL 7.0/Server 2000
> The c: drive crashed;
> the MDF and LDF files were on drive D:
> When a new hard drive is installed with the new OS and
> SQL 7.0 installed on it...is there a way to "attach" to
> the MDF and LDF files? or are they lost?
> Thanks,
> Don
>
sp_attach_db is the first choice, but if you have issues this may help:
http://www.sqlservercentral.com/scripts/scriptdetails.asp?scriptid=599
Restoring a .mdf
http://www.support.microsoft.com/?id=314546 Moving DB's between Servers
http://www.support.microsoft.com/?id=224071 Moving SQL Server Databases
to a New Location with Detach/Attach
http://support.microsoft.com/?id=221465 Using WITH MOVE in a
Restore
http://www.support.microsoft.com/?id=246133 How To Transfer Logins and
Passwords Between SQL Servers
http://www.support.microsoft.com/?id=298897 Mapping Logins & SIDs after a
Restore
http://www.dbmaint.com/SyncSqlLogins.asp Utility to map logins to
users
http://www.support.microsoft.com/?id=168001 User Logon and/or Permission
Errors After Restoring Dump
http://www.support.microsoft.com/?id=240872 How to Resolve Permission
Issues When a Database Is Moved Between SQL Servers
http://www.support.microsoft.com/?id=307775 Disaster Recovery Articles
for SQL Server
Andrew J. Kelly SQL MVP
"Don" <anonymous@.discussions.microsoft.com> wrote in message
news:086001c4f832$20813b50$a601280a@.phx.gbl...
> SQL 7.0/Server 2000
> The c: drive crashed;
> the MDF and LDF files were on drive D:
> When a new hard drive is installed with the new OS and
> SQL 7.0 installed on it...is there a way to "attach" to
> the MDF and LDF files? or are they lost?
> Thanks,
> Don
>
Monday, February 27, 2012
Handling very large XML result sets
I am generating very large XML result sets using the SQLXML classes and FOR XML EXPLICIT queries. What are some strategies I can
use to break up and process these large result sets? The overhead of issuing
multiple queries by breaking them up via WHERE clause filters isn't the way I
want to go, since my queries are very large and take significant time to
process within SQL Server.
I am currently experiencing out of memory exceptions on some larger result
sets (~50-60 Mbytes total XML file size). My first attempt was using
SqlXmlCommand.ExecuteXmlReader and an XmlDocument via this snippet of code:
XmlReader xr = forXMLCommand.ExecuteXmlReader();
XmlDocument xd = new XmlDocument();
xd.Load(xr);
This throws a System.OutOfMemoryException on the call to ExecuteXmlReader
when the result set gets very large.
I also tried using SqlXmlCommand.ExecuteStream thinking I could read a
buffer of chars at a time to process these large result sets but this also
resulted in a System.OutOfMemoryException on the call to ExecuteStream:
Stream s = forXMLCommand.ExecuteStream();
StreamWriter sw3 = new StreamWriter(mResultsFileName);
using (StreamReader sr = new StreamReader(s))
{
    char[] c = null;
    while (sr.Peek() >= 0)
    {
        c = new char[10000];
        int numRead = sr.Read(c, 0, c.Length);
        sw3.Write(c, 0, numRead);
    }
}
I have tried running my application on two different systems, one with 1G of
main memory and the other a Win2K3 server with 8G of main memory. Both
systems seem to run out of memory at the same 50-60 MB limit. Are there any
.NET memory settings I can tweak to give my .NET application more memory?
Thanks for your suggestions and ideas,
Scott
The XmlReader is a streaming interface which should not run out of memory
via the SqlXmlCommand.ExecuteStream method.
Loading into an XmlDocument however will cache the entire document into
memory.
Can you remove the following two lines from your repro and see if you are
still having the problem:
XmlDocument xd = new XmlDocument();
xd.Load(xr);
Thanks -
Andrew Conrad
Microsoft Corp
http://blogs.msdn.com/aconrad
Andrew,
That was exactly my thought as well, but ExecuteStream is throwing an
OutOfMemoryException. I am NOT calling XmlDocument.Load in the code that
uses ExecuteStream.
Here is my full method I am using:
private void ExecuteSQLXMLCommandExecuteStream()
{
    try
    {
        SqlXmlCommand forXMLCommand = new SqlXmlCommand("Provider=SQLOLEDB;DATA SOURCE=Gibraltar;Initial Catalog=RDCModel;User ID=sa;Password=XXXX");
        forXMLCommand.CommandType = SqlXmlCommandType.Sql;

        StreamReader sr1 = new StreamReader(mQueryFileName);
        string query = sr1.ReadToEnd();
        sr1.Close();
        query = query.Replace("\r\n", " ");
        query = query.Replace("\t", " ");
        forXMLCommand.CommandText = query;

        Stream s = forXMLCommand.ExecuteStream();
        StreamWriter sw3 = new StreamWriter(mResultsFileName);
        using (StreamReader sr = new StreamReader(s))
        {
            char[] c = null;
            while (sr.Peek() >= 0)
            {
                c = new char[10000];
                int numRead = sr.Read(c, 0, c.Length);
                sw3.Write(c, 0, numRead);
            }
        }
        sw3.Close();
    }
    catch (SqlXmlException ex)
    {
        ex.ErrorStream.Position = 0;
        string sqlErrorString;
        sqlErrorString = new StreamReader(ex.ErrorStream).ReadToEnd();
        Console.WriteLine(sqlErrorString);
        RDCUtilities.WriteToLog(sqlErrorString);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
        Console.WriteLine(ex.StackTrace);
        RDCUtilities.WriteToLog(ex.Message);
    }
}
""Andrew Conrad"" wrote:
> The XmlReader is a streaming interface which should not run out of memory
> via the SqlXmlCommand.ExecuteStream method.
> Loading into an XmlDocument however will cache the entire document into
> memory.
> Can you remove the following two lines from your repro and see if you are
> still having the problem:
> XmlDocument xd = new XmlDocument();
> xd.Load(xr);
> Thanks -
> Andrew Conrad
> Microsoft Corp
> http://blogs.msdn.com/aconrad
>
Try using SqlXmlCommand.ExecuteToStream() instead of ExecuteStream.
Because of some technical limitations with COM interop, ExecuteStream
caches results.
Andrew Conrad
Microsoft Corp
http://blogs.msdn.com/aconrad
Handling flat files that do not exist
Hello,
I have a package that contains 22 data flow tasks, one for each flat file that I need to process and import. I decided against making each import a separate package because I am loading the package in an external application and calling it from there.
Now, everything works beautifully when all my text files are exported from a datasource beyond my control. I have an application that processes a series of files encoded using EBCDIC and I am not always guaranteed that all the flat files will be exported. (There may have not been any data for the day.)
I am looking for suggestions on how to handle files that do not exist. I have tried making a package level error handler (Script task) that checks the error code ("System::ErrorCode") and, if it tells me that the file cannot be found, I return Dts.TaskResult = Dts.Results.Success, but that is not working for me; the package still fails. I have also thought about programmatically disabling the tasks that do not have a corresponding flat file, but it seems like overkill.
So I guess my question is this; if the file does not exist, how can I either a) skip the task in the package, or b) quietly handle the error and move on without failing the package?
Thanks!
Lee.
I recall that I have a couple of data flows in my package, each within a sequence container, connected with completion (blue) arrows, and even if the first one failed the second one continued on...
|||
That is one solution that I could use. It doesn't seem like it is the "elegant" way to handle it, but it will work.
Thanks for the suggestion.
Well, would dragging the red (failure) arrow to some alert/send mail task not serve the purpose?
Scoutn wrote:
Hello,
I have a package that contains 22 data flow tasks, one for each flat file that I need to process and import. I decided against making each import a seperate package because I am loading the package in an external application and calling it from there.
Now, everything works beautifully when all my text files are exported from a datasource beyond my control. I have an application that processes a series of files encoded using EBCDIC and I am not always gauranteed that all the flat files will be exported. (There may have not been any data for the day.)
I am looking for suggestions on how to handle files that do not exist. I have tried making a package level error handler (Script task) that checks the error code ("System::ErrorCode") and if it tells me that the file cannot be found, I return Dts.TaskResult = Dts.Results.Sucsess, but that is not working for me, the package still fails. I have also thought about progmatically disabling the tasks that do not have a corresponding flat file, but it seems like over kill.
So I guess my question is this; if the file does not exist, how can I either a) skip the task in the package, or b) quietly handle the error and move on without failing the package?
Thanks!
Lee.
You can use a script task to check whether or not a file exists. This code should help:
File.Exists Method
(http://msdn2.microsoft.com/en-us/library/system.io.file.exists.aspx)
-Jamie
Jamie,
First allow me to say that your blog has been an invaluable resource. I have learned a lot from your articles, thank you!
Utsav, I am not looking to be notified of the error; I just want the task to merrily carry on like nothing took place. I do understand what you meant though, thank you.
Sometimes the flat file will be there and sometimes it won't be. If it isn't, I would still like the task to move on to the next with a "Success" result, not just completion. That is why I was thinking of disabling the tasks that do not have corresponding flat files programmatically when I load the package from my application.
I guess I should be asking; when I set Dts.TaskResult = Dts.Results.Success, why does it not actually return success?
(I know there are a couple of ways I can get around this issue, I'm just looking for the easiest without setting the task to continue on "Completion" ;)
Thanks again,
Lee.
Check if this thread has something that can help you.
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=859625&SiteID=1
Instead of disabling the task, try to use an expression in the precedence constraint in the control flow.
Rafael Salas
Thank you very much. I did search the forum; I guess I was just using the wrong search term.
Perfect!
The thread (and the subsequent blog articles) solved my problem. I needed to put each task in a sequence container with their own script task checking for the files; if I used one script task to look for all the files the next task would not run because at least one file would return false.
Thanks again for everyone's help!