Showing posts with label file. Show all posts

Friday, March 30, 2012

Having trouble installing SQL Server 2005 SP1

I'm having a problem installing SQL Server 2005 SP1 on Windows Server 2003 R2 SP2.

The log file talks about SQL Server Express, but I have SQL Server 2005.

03/19/2007 13:49:07.203 ================================================================================
03/19/2007 13:49:07.203 Hotfix package launched
03/19/2007 13:49:32.984 Attempting to install instance: SQL Server Native Client
03/19/2007 13:49:33.000 Attempting to install target: OMNIWEBPACS
03/19/2007 13:49:33.000 Attempting to install file: sqlncli.msi
03/19/2007 13:49:33.015 Attempting to install file: \\OMNIWEBPACS\d$\bc06cd32e1991973c97744b9c4\HotFixSqlncli\Files\sqlncli.msi
03/19/2007 13:49:33.015 Creating MSI install log file at: C:\WINDOWS\Hotfix\Redist9\Logs\Redist9_Hotfix_KB913090_sqlncli.msi.log
03/19/2007 13:49:33.015 Successfully opened registry key: Software\Policies\Microsoft\Windows\Installer
03/19/2007 13:49:33.015 Failed to read registry key: Debug
03/19/2007 13:49:34.062 MSP returned 0: The action completed successfully.
03/19/2007 13:49:34.062 Successfully opened registry key: Software\Policies\Microsoft\Windows\Installer
03/19/2007 13:49:34.062 Failed to read registry key: Debug
03/19/2007 13:49:34.062 Successfully installed file: \\OMNIWEBPACS\d$\bc06cd32e1991973c97744b9c4\HotFixSqlncli\Files\sqlncli.msi
03/19/2007 13:49:34.062 Successfully installed target: OMNIWEBPACS
03/19/2007 13:49:34.062 Successfully installed instance: SQL Server Native Client
03/19/2007 13:49:34.062
03/19/2007 13:49:34.062 Product Status Summary:
03/19/2007 13:49:34.062 Product: SQL Server Native Client
03/19/2007 13:49:34.062 SQL Server Native Client (RTM ) - Success
03/19/2007 13:49:34.062
03/19/2007 13:49:34.062 Product: Setup Support Files
03/19/2007 13:49:34.062 Setup Support Files (RTM ) - Not Applied
03/19/2007 13:49:34.062
03/19/2007 13:49:34.062 Product: Database Services
03/19/2007 13:49:34.062 Database Services (SP1 2047 ENU) - NA
03/19/2007 13:49:34.062 Details: Instances of SQL Server Express cannot be updated by using this Service Pack installer. To update instances of SQL Server Express, use the SQL Server Express Service Pack installer.
03/19/2007 13:49:34.062
03/19/2007 13:49:34.062 Product: Database Services
03/19/2007 13:49:34.062 Database Services (RTM 1399 ENU) - Not Applied
03/19/2007 13:49:34.062
03/19/2007 13:49:34.062 Product: Integration Services
03/19/2007 13:49:34.062 Integration Services (RTM 1399 ENU) - Not Applied
03/19/2007 13:49:34.062
03/19/2007 13:49:34.062 Product: Client Components
03/19/2007 13:49:34.062 Client Components (RTM 1399 ENU) - Not Applied
03/19/2007 13:49:34.062
03/19/2007 13:49:34.062 Product: MSXML 6.0 Parser
03/19/2007 13:49:34.062 MSXML 6.0 Parser (RTM ) - Not Applied
03/19/2007 13:49:34.062
03/19/2007 13:49:34.062 Product: SQLXML4
03/19/2007 13:49:34.062 SQLXML4 (RTM ) - Not Applied
03/19/2007 13:49:34.062
03/19/2007 13:49:34.062 Product: Backward Compatibility
03/19/2007 13:49:34.078 Backward Compatibility (RTM ) - Not Applied
03/19/2007 13:49:34.078
03/19/2007 13:49:34.078 Product: Microsoft SQL Server VSS Writer
03/19/2007 13:49:34.078 Microsoft SQL Server VSS Writer (RTM ) - Not Applied
03/19/2007 13:49:34.078

Any ideas?

It means you applied the wrong package. You will need one of the following ENU packages, depending on your processor architecture:

SQLServer2005SP1-KB913090-x86-ENU.exe

SQLServer2005SP1-KB913090-x64-ENU.exe

SQLServer2005SP1-KB913090-ia64-ENU.exe

Having problems with variables

I have an Execute Process Task that executes a batch file, which downloads a file with a dynamic name (it carries a datetime stamp). I would now like to load this file through a Flat File Source in the Data Flow Task automatically — in other words, point the connection manager at the new file on the fly. Is something like this possible with SSIS, or am I simply hitting a wall here?

Thank you.

Why not just drop a Data Flow into your package and make sure it executes after the Execute Process Task?

-Jamie

Wednesday, March 28, 2012

Having problems importing MDF file (no LDF)

We have a server where the LDF file was deleted by someone. The
software vendor (Platypus) actually recommended doing this. I'm now
stuck with trying to get this DB back up and running.
I tried the methods listed here, but they were not successful. The
failure occurs at the point of rebuilding the LDF file. It ran for
over 15 hours with no effect.
Any ideas here?
Thanks,
Doug

You didn't actually mention the "methods listed here", but try
sp_attach_single_file_db.
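
For reference, the attach attempt can be sketched like this (the database name and MDF path are placeholders, not from the thread):

```sql
-- Attach an MDF without its LDF; SQL Server will attempt to build a new log file.
EXEC sp_attach_single_file_db
    @dbname   = N'PlatypusDb',              -- hypothetical database name
    @physname = N'C:\Data\PlatypusDb.mdf';  -- hypothetical path to the orphaned MDF
```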
sp_attach_db also creates the t-log file if it does not exist.

--
Ekrem Önsoy
On Oct 7, 4:46 pm, "TheSQLGuru" <kgbo...@.earthlink.net> wrote:
> You didn't actually mention the "methods listed here", but try
> sp_attach_single_file_db.

I tried that, as well as the method listed at
http://www.sqlnewsgroups.net/group/microsoft.public.sqlserver.server/topic4869.aspx

Your software vendor is insane. Is the database in suspect mode, or is it failing to attach entirely when you use sp_attach_single_file_db?
On Oct 7, 7:11 pm, "Mike C#" <x...@.xyz.com> wrote:
> Is the database in suspect mode or is it failing to attach?

It fails when trying to run sp_attach_single_file_db, with an error saying the
log file is missing or invalid.

Doug

Basically, you will want to restore from your backup. Even if you find some hack to get this
database into SQL Server, your data will be inconsistent. Whatever hack you find will force a
new log file to be created, but the reason it complains is that SQL Server needs the log
file to do REDO and UNDO of transactions in order to produce a consistent database. So any hack you
find will leave you with "half-performed transactions", i.e., logically inconsistent data (like
an order for a customer that doesn't exist). SQL Server also uses transactions to protect
metadata modifications, so you risk physical corruption as well.
--
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://sqlblog.com/blogs/tibor_karaszi
On Oct 8, 12:44 pm, "Tibor Karaszi"
<tibor_please.no.email_kara...@.hotmail.nomail.com> wrote:
> Basically, you will want to restore from your backup. [...]
We tried to restore the backups, but it asks for a file that
doesn't exist. For example, we tell it to restore from Wednesday, but it also
asks for Friday, which doesn't exist.
This isn't my machine directly; it belongs to someone else I'm
trying to help (for free, of course :) ).
Doug

> We tried to restore the backups
I assume these are backups taken with the BACKUP DATABASE command?

> but it's asking for a file that
> doesn't exist.

Who is "it"? Use the RESTORE DATABASE command, and possibly RESTORE LOG (if you also have log
backups).
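
For a scripted restore, a minimal sketch (the database name, paths, and file names are placeholders):

```sql
-- Restore the full backup first, leaving the database able to accept log backups...
RESTORE DATABASE PlatypusDb
    FROM DISK = N'D:\Backups\PlatypusDb_full.bak'
    WITH NORECOVERY;

-- ...then apply each log backup in sequence, recovering on the last one.
RESTORE LOG PlatypusDb
    FROM DISK = N'D:\Backups\PlatypusDb_log.trn'
    WITH RECOVERY;
```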
--
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://sqlblog.com/blogs/tibor_karaszi
On Oct 8, 2:59 pm, "Tibor Karaszi"
<tibor_please.no.email_kara...@.hotmail.nomail.com> wrote:
> Who is "it"? Use the RESTORE DATABASE command and possibly RESTORE LOG (if you also have log
> backups). [...]
We tried doing the restore through the enterprise manager.
Yes, this was done with the SQL backup manager.
> We tried doing the restore through the enterprise manager.
Try the RESTORE command instead. You never know what a GUI tool is going to do, and it is also easier to
communicate commands and error messages this way.
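
If the tool keeps asking for files you don't have, it can help to first inspect what is actually inside each backup file (the path below is a placeholder):

```sql
-- HEADERONLY lists every backup set stored in the file;
-- FILELISTONLY shows the database files contained in a given set.
RESTORE HEADERONLY   FROM DISK = N'D:\Backups\PlatypusDb_full.bak';
RESTORE FILELISTONLY FROM DISK = N'D:\Backups\PlatypusDb_full.bak';
```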
--
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://sqlblog.com/blogs/tibor_karaszi

Having Problem with SQL Express 2005

I just started working with SQL Server 2005, and I have a Windows-based application in C# with an attached SQL Server database file. Everything is OK, but I don't know whether I can create a password for that database.
There is no IDE for SQL Server 2005 like Enterprise Manager in SQL Server 2000 for doing such things (creating roles, passwords, ...). On my computer I've installed VS 2005 Professional and SQL Express. I think I should do all these things with system stored procedures, right? But it's really tedious to do all of that without an IDE.

Thanks in advance.

If you have a look at the following link you will find a download for SQL Server Management Studio Express. This is an IDE based on the full Management Studio (the replacement for Enterprise Manager and the query tools) that can be used with the Express editions.

http://msdn.microsoft.com/vstudio/express/sql/download/
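
Until you have the tool, the same tasks can also be scripted in T-SQL; a minimal sketch, with a hypothetical login name and password:

```sql
-- Create a server-level login, map it to a user in your database,
-- and add that user to a fixed database role.
CREATE LOGIN AppUser WITH PASSWORD = N'Str0ng!Passw0rd';  -- hypothetical credentials
CREATE USER AppUser FOR LOGIN AppUser;
EXEC sp_addrolemember N'db_datareader', N'AppUser';
```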

Thank you.

Having Problem While Importing a Text File

Hello everybody,
Our system uses SQL Server 2000 on Windows XP / Windows 2000.
We have a text file that needs to be imported into SQL Server 2000 as a
table.
But we are facing a problem:
SQL Server has a row size limit (8060 bytes),
so it cannot proceed with the import if the text file has a record
bigger than 8060.
The records in the text file are bigger than 8060, so we
cannot import the file.
On the other hand, it is said that SQL Server 2005 can take a record
bigger than 8060, but
again we could not perform the task.

As a result, I urgently need to know how I can import a text
file that has records bigger than 8060 characters.
Any help is appreciated.
Thanks a lot!

Tunc Ovacik

panic attack (tunc.ovacik@.gmail.com) writes:

Quote:

Originally Posted by

[...]


How do you import the file? BCP, BULK INSERT or DTS?

Could you post the CREATE TABLE statement for the table in question?

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

-- SNIP --

Quote:

Originally Posted by

[...]


-- SNIP --

Good day,

If you're using large data types for your columns (such as varchar(max),
nvarchar(max), varbinary(max), text, image, and xml), you can go beyond
the 8060-byte row limit.

Alternatively, if you aren't using large data types, you can vertically
partition the table, so that some of the columns are in one table while
the rest are in another.
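
As a sketch of the first option on SQL Server 2005 (the table and file names are placeholders):

```sql
-- varchar(max) data is moved off-row when needed, so a single row
-- can exceed the 8060-byte in-row limit.
CREATE TABLE dbo.WideImport (
    Line varchar(max) NULL
);

BULK INSERT dbo.WideImport
    FROM 'C:\data\wide_records.txt'   -- hypothetical input file
    WITH (ROWTERMINATOR = '\n');
```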

Hope this helps.

Regards,
N.I.T.I.N.

Erland Sommarskog wrote:

Quote:

Originally Posted by

How do you import the file? BCP, BULK INSERT or DTS?

Could you post the CREATE TABLE statement for the table in question?


Hi again,
Thanks for your concern, I really appreciate it.
We are importing the text file using DTS.
Here is the CREATE TABLE statement used by DTS:

CREATE TABLE [nwind].[dbo].[DDD] (
[Col001] varchar (255) NULL,
[Col002] varchar (255) NULL,
-- ... Col003 through Col470, all varchar (255) NULL ...
[Col471] varchar (255) NULL,
[Col472] varchar (255) NULL,
[Col473] varchar (255) NULL,
[Col474] varchar (255) NULL,
[Col475] varchar (255) NULL,
[Col476] varchar (255) NULL,
[Col477] varchar (255) NULL,
[Col478] varchar (255) NULL,
[Col479] varchar (255) NULL,
[Col480] varchar (255) NULL,
[Col481] varchar (255) NULL,
[Col482] varchar (255) NULL,
[Col483] varchar (255) NULL,
[Col484] varchar (255) NULL,
[Col485] varchar (255) NULL,
[Col486] varchar (255) NULL,
[Col487] varchar (255) NULL,
[Col488] varchar (255) NULL,
[Col489] varchar (255) NULL,
[Col490] varchar (255) NULL,
[Col491] varchar (255) NULL,
[Col492] varchar (255) NULL,
[Col493] varchar (255) NULL,
[Col494] varchar (255) NULL,
[Col495] varchar (255) NULL,
[Col496] varchar (255) NULL,
[Col497] varchar (255) NULL,
[Col498] varchar (255) NULL,
[Col499] varchar (255) NULL,
[Col500] varchar (255) NULL,
[Col501] varchar (255) NULL,
[Col502] varchar (255) NULL,
[Col503] varchar (255) NULL,
[Col504] varchar (255) NULL,
[Col505] varchar (255) NULL,
[Col506] varchar (255) NULL,
[Col507] varchar (255) NULL,
[Col508] varchar (255) NULL,
[Col509] varchar (255) NULL,
[Col510] varchar (255) NULL,
[Col511] varchar (255) NULL,
[Col512] varchar (255) NULL,
[Col513] varchar (255) NULL,
[Col514] varchar (255) NULL,
[Col515] varchar (255) NULL,
[Col516] varchar (255) NULL,
[Col517] varchar (255) NULL,
[Col518] varchar (255) NULL,
[Col519] varchar (255) NULL,
[Col520] varchar (255) NULL,
[Col521] varchar (255) NULL,
[Col522] varchar (255) NULL,
[Col523] varchar (255) NULL,
[Col524] varchar (255) NULL,
[Col525] varchar (255) NULL,
[Col526] varchar (255) NULL,
[Col527] varchar (255) NULL,
[Col528] varchar (255) NULL,
[Col529] varchar (255) NULL,
[Col530] varchar (255) NULL,
[Col531] varchar (255) NULL,
[Col532] varchar (255) NULL,
[Col533] varchar (255) NULL,
[Col534] varchar (255) NULL,
[Col535] varchar (255) NULL,
[Col536] varchar (255) NULL,
[Col537] varchar (255) NULL,
[Col538] varchar (255) NULL,
[Col539] varchar (255) NULL,
[Col540] varchar (255) NULL,
[Col541] varchar (255) NULL,
[Col542] varchar (255) NULL,
[Col543] varchar (255) NULL
)

and DTS is reporting an error like this :
"
Error at destination for row number 13024. Errors encountered so far
in this task: 1.
The statement has been terminated.
Cannot create a row of size 9997 which is greater than the allowable
maximum of 8060.
"
I also tried "nvarchar", "ntext", etc. but none of them worked. :((
If you need any further information about the process, please let me
know; I will respond/answer as soon as possible.
Thanks a lot again...

Tunc Ovacik|||panic attack (tunc.ovacik@.gmail.com) writes:

Quote:

Originally Posted by

CREATE TABLE [nwind].[dbo].[DDD] (
[Col001] varchar (255) NULL,
[Col002] varchar (255) NULL,
[Col003] varchar (255) NULL,
>...
[Col543] varchar (255) NULL
)


The table appears somewhat funny. Does the table really reflect your
business rules? 255 * 543 is 138465 and with a maximum row size of
8060 in SQL Server, this is not likely to turn out well.

Quote:

Originally Posted by

i also tried "nvarchar" , "ntext" ect. but none of them worked. :((


If you tried 543 ntext columns, I can understand why that fails. An ntext
column has a 16-byte pointer which is stored in the row, and the real data is
elsewhere. 543 * 16 is 8688, so you can't have all those text pointers
on a single row.

Does your input file really have 543 input fields?

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx|||panic attack wrote:

Quote:

Originally Posted by

Erland Sommarskog wrote:

Quote:

Originally Posted by

panic attack (tunc.ovacik@.gmail.com) writes:


-- SNIP --

Quote:

Originally Posted by

CREATE TABLE [nwind].[dbo].[DDD] (
[Col001] varchar (255) NULL,


-- SNIP --

Quote:

Originally Posted by

[Col543] varchar (255) NULL
)


-- SNIP --

Hi!

I know I said earlier that you could use long data types, and the table
you're trying to create is in accordance with my statement earlier.
However, I should have added that each column stores a pointer in the
row and the actual data is stored in a different location. If you add
up the sizes of the pointers along with other row data they should be
below the ~8K limit too. I guess I should be more accurate when I say
something in future like those people who speak legal-ese. The DDL
query was really funny to look at, and it's the first time I ever used
the "read more" link on Google Groups.

I would suggest that you partition your tables vertically so you have
some of the columns in one table and the other columns in another table
(...or perhaps more than 2 tables, depending on the sizes... I'm not
really good at the math).

N.I.T.I.N.

PS: I hope I never have to deal with such a monstrosity - a table that
has so many columns. I once had to deal with 36 columns and that was
too much for me as a developer (that was before my days as a DBA). I
split it up into 3 tables though people may say it is less efficient to
have 3 queries instead of one (remember the days when people said you
should use assembly language as the code is smaller & faster?).

PPS: No offence to assembly language developers in the last 'PS'. I
totally respect people who still use assembly, but for me it's just a
little too much source code to think straight - I'd spend a whole hour
doing something that I could do in 15 minutes with VB, Java or C# (when
equipped with the right IDE, of course!).|||Of course it has 543 columns :)))
the data includes records going back about 6 years, 20 quarters and 60
months, and for each period it has 6 parameters plus some other info
(text) columns.
So if you do the math:
(6 * 6) + ( 20 * 6 ) + ( 60 * 6 ) = 516 columns.

And if you add the other columns, the result is 543.
As I said, I tried various types to get the data into SQL
(nvarchar, varchar, ntext, etc.)
but none of them worked out.

What do you think, Erland? Do we have a chance to get over this problem,
or is it not possible to get the data into SQL Server 2000?

NOTE: By the way, there is other data that we are importing to SQL
Server 2000, and it has 124 columns. No problems occur while getting that
data into SQL Server 2000. If we apply the same logic as you did,
124 * 255 = 31620,
so it is also bigger than 8060, but we are doing the operation without
any problems.
It seems that there is a contradiction, doesn't it?

What about SQL Server 2005? Is there any limitation in SQL Server 2005
on the row size?

Tunc

Erland Sommarskog wrote:

Quote:

Originally Posted by

panic attack (tunc.ovacik@.gmail.com) writes:

Quote:

Originally Posted by

CREATE TABLE [nwind].[dbo].[DDD] (
[Col001] varchar (255) NULL,
[Col002] varchar (255) NULL,
[Col003] varchar (255) NULL,
...
[Col543] varchar (255) NULL
)


>
The table appears somewhat funny. Does the table really reflect your
business rules? 255 * 543 is 138465 and with a maximum row size of
8060 in SQL Server, this is not likely to turn out well.
>

Quote:

Originally Posted by

i also tried "nvarchar" , "ntext" ect. but none of them worked. :((


>
If you tried 543 ntext columns, I can understand why that fails. An ntext
column has a 16-byte pointer which is stored in the row, and the real data is
elsewhere. 543 * 16 is 8688, so you can't have all those text pointers
on a single row.
>
Does your input file really have 543 input fields?
>
>
>
>
--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
>
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

|||Hi again...
Partitioning the table is one possible solution, but unfortunately it is
not suitable for our production system. :((
Because we have lots of clients, if we partition the tables for each
client there are going to be an enormous number of tables, so it is not
possible to deal with that many tables right now...

Hence, we need to get the data in at once, in one table.
Perhaps we can combine some columns into one column, but again the data
will be less usable while manipulating it.
As you can see, it doesn't look good... :((

I hope that Erland may advise another solution to the problem,
or we may upgrade the database to SQL Server 2005 if that is going to
help us get the data into one table without any problems.

I really appreciate your help.
Thanks a lot.
Best regards.

Tunc

NiTiN yazdi:

Quote:

Originally Posted by

panic attack wrote:

Quote:

Originally Posted by

Erland Sommarskog wrote:

Quote:

Originally Posted by

panic attack (tunc.ovacik@.gmail.com) writes:


>
-- SNIP --
>

Quote:

Originally Posted by

CREATE TABLE [nwind].[dbo].[DDD] (
[Col001] varchar (255) NULL,


>
-- SNIP --
>

Quote:

Originally Posted by

[Col543] varchar (255) NULL
)


>
-- SNIP --
>
>
Hi!
>
I know I said earlier that you could use long data types, and the table
you're trying to create is in accordance with my statement earlier.
However, I should have added that each column stores a pointer in the
row and the actual data is stored in a different location. If you add
up the sizes of the pointers along with other row data they should be
below the ~8K limit too. I guess I should be more accurate when I say
something in future like those people who speak legal-ese. The DDL
query was really funny to look at, and it's the first time I ever used
the "read more" link on Google Groups.
>
I would suggest that you partition your tables vertically so you have
some of the columns in one table and the other columns in another table
(...or perhaps more than 2 tables, depending on the sizes... I'm not
really good at the math).
>
N.I.T.I.N.
>
PS: I hope I never have to deal with such a monstrosity - a table that
has so many columns. I once had to deal with 36 columns and that was
too much for me as a developer (that was before my days as a DBA). I
split it up into 3 tables though people may say it is less efficient to
have 3 queries instead of one (remember the days when people said you
should use assembly language as the code is smaller & faster?).
>
PPS: No offence to assembly language developers in the last 'PS'. I
totally respect people who still use assembly, but for me it's just a
little too much source code to think straight - I'd spend a whole hour
doing something that I could do in 15 minutes with VB, Java or C# (when
equipped with the right IDE, of course!).

|||panic attack (tunc.ovacik@.gmail.com) writes:

Quote:

Originally Posted by

ofcourse it has 543 columns :)))
the data includes records for about 6 years, 20 quarters and 60 months
back data and for each period it has 6 parameters and some other
columns info(text).
so if you do the math;
(6 * 6) + ( 20 * 6 ) + ( 60 * 6 ) = 516 columns.


That sounds like 516 rows to me, not 516 columns. At least with a
proper data model. Or is this a staging table?

Quote:

Originally Posted by

what do you think Erland? do we have a chance to get over this problem?
or it is not possible to get the data into SQL Server 2000?


As NiTiN said, you will have to split the table in two vertically. Note
that it does not have to affect queries, as you can construct views that
combine them. You would then have to use a format file to make it possible
to import only selected columns.
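
A minimal sketch of such a vertical split (table names are hypothetical, and it is assumed here that Col001 and Col002 form the key):

```sql
-- Hypothetical split of the 543-column staging table into two tables.
-- The key columns are repeated in both so the halves can be joined back.
CREATE TABLE DDD_a (
    [Col001] varchar (255) NOT NULL,  -- assumed key
    [Col002] varchar (255) NOT NULL,  -- assumed key
    [Col003] varchar (255) NULL
    -- ... remaining columns up to [Col250]
)

CREATE TABLE DDD_b (
    [Col001] varchar (255) NOT NULL,  -- key, repeated
    [Col002] varchar (255) NOT NULL,  -- key, repeated
    [Col251] varchar (255) NULL
    -- ... remaining columns up to [Col543]
)
```

Each table then gets its own import run, restricted to its subset of the input fields via a format file.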

Quote:

Originally Posted by

NOTE : By the way there is another data that we are importing to Sql
Server 2000.
and it has 124 columns. no problems occur while getting the data into
Sql Server 2000. if we apply the same logic as you did ,
124 * 255 = 31620
so it is also bigger than 8060. but we are doing the operation without
any problems.
it seems that there is a contradiction doesnt it?


No. What matters is the actual row size, not the possible max.

Quote:

Originally Posted by

what about SQL Server 2005? is there any limitation at sql server 2005
about the row size?


No. SQL 2005 is yet another option. SQL 2005 permits rows to span multiple
pages.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx|||Thanks for your fast answers.
There is one last thing that I need to ask...!!
I have now decided to partition the data vertically,
and at this point there is one thing I need to ask...

Now here is the case:

after partitioning the table it is gonna look like this:

table 1
---------------
column1 column2 ... column250
record1 record2 ... record250

table2
----------------
column1 column2 ... column250
record1 record2 ... record250

At this point I need to combine these tables (mentioned above)
vertically, right?

How am I going to do the combine operation after partitioning the table
into 2 or 3?

I tried to combine them by using the "UNION" operator, but I guess that
works for combining the tables horizontally.

thanks a lot
best regards.

tunc ovacik

Erland Sommarskog yazdi:

Quote:

Originally Posted by

panic attack (tunc.ovacik@.gmail.com) writes:

Quote:

Originally Posted by

ofcourse it has 543 columns :)))
the data includes records for about 6 years, 20 quarters and 60 months
back data and for each period it has 6 parameters and some other
columns info(text).
so if you do the math;
(6 * 6) + ( 20 * 6 ) + ( 60 * 6 ) = 516 columns.


>
That sounds like 516 rows to me, not 516 columns. At least with a
proper data model. Or is this a staging table?
>

Quote:

Originally Posted by

what do you think Erland? do we have a chance to get over this problem?
or it is not possible to get the data into SQL Server 2000?


>
As NiTiN said, you will have to split the table in two vertically. Note
that it does not have to affect queries, as you can construct views that
combine them. You would then have to use a format file to make it possible
to import only selected columns.
>

Quote:

Originally Posted by

NOTE : By the way there is another data that we are importing to Sql
Server 2000.
and it has 124 columns. no problems occur while getting the data into
Sql Server 2000. if we apply the same logic as you did ,
124 * 255 = 31620
so it is also bigger than 8060. but we are doing the operation without
any problems.
it seems that there is a contradiction doesnt it?


>
No. What matters is the actual row size, not the possible max.
>

Quote:

Originally Posted by

what about SQL Server 2005? is there any limitation at sql server 2005
about the row size?


>
No. SQL 2005 is yet another option. SQL 2005 permits rows to span multiple
pages.
>
--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
>
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

|||On 4 Aug 2006 01:17:45 -0700, "panic attack" <tunc.ovacik@.gmail.com>
wrote:

Quote:

Originally Posted by

>partitioning the table is one of the solvation but for our production
>system unfortunately it is not proper for use. :((
>cause we have lots of clients and if we partition the tables for our
>each client there are gonna be enourmus number of tables, so it is not
>possible to deal with those number of tables right now...


The idea was not to partition the table by client, but to normalize it
so that the time periods are rows, not columns (for one example.)

My suggestion is to define multiple staging tables, each with a subset
of the columns. All would have to include the key column(s), then
each would include a different part of the rest. One data import for
each table, of course, selective on columns. Then when the data is
in, JOIN on the keys.
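
Roy's normalization idea could be sketched like this (table and column names are hypothetical, not taken from the actual file):

```sql
-- Hypothetical normalized layout: one row per key and period instead of
-- 516 period columns. The 6 parameters per period stay as columns.
CREATE TABLE PeriodData (
    ClientKey  varchar (255) NOT NULL,  -- assumed key from the source file
    PeriodType char (1)      NOT NULL,  -- 'Y' = year, 'Q' = quarter, 'M' = month
    PeriodNo   int           NOT NULL,  -- 1..6, 1..20 or 1..60
    Param1     varchar (255) NULL,
    Param2     varchar (255) NULL,
    Param3     varchar (255) NULL,
    Param4     varchar (255) NULL,
    Param5     varchar (255) NULL,
    Param6     varchar (255) NULL
)
```

With 86 period rows (6 + 20 + 60) per source record, each row stays far below the 8060-byte limit.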

Roy Harvey
Beacon Falls, CT|||panic attack (tunc.ovacik@.gmail.com) writes:

Quote:

Originally Posted by

thanks for your fast answers.
there is one last thing that i need to ask...!!
now i decided to partition the data vertically
at this point there is one thing i need to ask...
>
now here is the case :
>
after partitioning the table it is gonna look like this:
>
table 1
---------------
column1 column2 ... column250
record1 record2 ... record250
>
>
table2
----------------
column1 column2 ... column250
record1 record2 ... record250
>
at this point i need to combine these tables( mentioned above)
vertically right?
>
how am i gonna do the combine operation after partitioning the table
into 2 or 3?


Hopefully there is a key in the data you import. Else you are in dire
straits. Say that columns 1 and 2 are the keys. Then you could define
a view as:

CREATE VIEW united AS
SELECT a.col1, a.col2, ... a.col250,
b.col251, ... b.col543
FROM tbl1 a
JOIN tbl2 b ON a.col1 = b.col1
AND a.col2 = b.col2
--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

Having difficulty setting Back Up to back up file without datetime stamp SQL 2K

Hello,

I'm trying to create a simple back up in the SQL Maintenance Plan that will
make a single back up copy of all databases every night at 10 pm. I'd like
the previous night's file to be overwritten, so there will be only a single
back up file for each database (tape back up runs every night, so each day's
back up will be saved on tape).

Every night the maintenance plan makes a back up of all the databases to a
new file with a datetime stamp, meaning the previous night's file still
exists. Even when I check "Remove files older than 22 hours" the previous
night's file still exists. Is there any way to create a back up file without
the datetime stamp so it overwrites the previous night's file?

Thanks!
Rick|||On 24.04.2007 15:29, Rico wrote:

Quote:

Originally Posted by

I'm trying to create a simple back up in the SQL Maintenance Plan that will
make a single back up copy of all database every night at 10 pm. I'd like
the previous nights file to be overwritten, so there will be only a single
back up file for each database (tape back up runs every night, so each days
back up will be saved on tape).
>
Every night the maintenance plan makes a back up of all the databases to a
new file with a datetime stamp, meaning the previous nights file still
exists. Even when I check "Remove files older than 22 hours" the previous
nights file still exists. Is there any way to create a back up file without
the date time stamp so it overwrites the previous nights file?


Yes, I believe the option is "WITH INIT". Please see BOL for details.
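
As a sketch, a backup that reuses the same file each night would look something like this (the database name and path here are made up for illustration):

```sql
-- WITH INIT overwrites any existing backup sets in the file
-- instead of appending a new set each night.
BACKUP DATABASE MyDatabase
TO DISK = 'D:\Backups\MyDatabase.bak'
WITH INIT
```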

Regards

robert|||Hi Robert,

Thanks, but I should have mentioned that I'm using SQL 2000 and I'm not
using T-SQL, just trying to create the Maintenance plan from the Enterprise
Manager.

Rick

"Robert Klemme" <shortcutter@.googlemail.com> wrote in message
news:596ghdF2jtij6U1@.mid.individual.net...

Quote:

Originally Posted by

On 24.04.2007 15:29, Rico wrote:

Quote:

Originally Posted by

>I'm trying to create a simple back up in the SQL Maintenance Plan that
>will
>make a single back up copy of all database every night at 10 pm. I'd
>like
>the previous nights file to be overwritten, so there will be only a
>single
>back up file for each database (tape back up runs every night, so each
>days
>back up will be saved on tape).
>>
>Every night the maintenance plan makes a back up of all the databases to
>a
>new file with a datetime stamp, meaning the previous nights file still
>exists. Even when I check "Remove files older than 22 hours" the
>previous
>nights file still exists. Is there any way to create a back up file
>without
>the date time stamp so it overwrites the previous nights file?


>
Yes, I believe option is "WITH INIT". Please see BOL for details.
>
Regards
>
robert

|||Rico (me@.you.com) writes:

Quote:

Originally Posted by

Thanks, but I should have mentioned that I'm using SQL 2000 and I'm not
using T-SQL, just trying to create the Maintenance plan from the
Enterprise Manager.


From what you described, you should not use a maintenance plan. Just set up
a scheduled job to run the BACKUP statement. You can do this from the
GUI where you backup databases, and select Schedule somewhere on a button.
In the end you get a one-step job that runs a BACKUP statement.

Then again, if you have any interest in acquiring basic DBA skills, you
should certainly learn to write basic BACKUP commands in T-SQL.
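
For reference, a one-step job along those lines could be created in T-SQL roughly like this (the names, path and schedule are illustrative, not from the thread):

```sql
-- Hypothetical nightly backup job, using the msdb job procedures
-- available in SQL Server 2000.
EXEC msdb.dbo.sp_add_job
    @job_name = N'Nightly full backup'
EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'Nightly full backup',
    @step_name = N'Backup database',
    @subsystem = N'TSQL',
    @command = N'BACKUP DATABASE MyDatabase
                 TO DISK = ''D:\Backups\MyDatabase.bak'' WITH INIT'
EXEC msdb.dbo.sp_add_jobschedule
    @job_name = N'Nightly full backup',
    @name = N'Every night at 10 pm',
    @freq_type = 4,             -- daily
    @freq_interval = 1,
    @active_start_time = 220000 -- 22:00:00
EXEC msdb.dbo.sp_add_jobserver
    @job_name = N'Nightly full backup',
    @server_name = N'(local)'
```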

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx|||Thanks Erland,

I don't have a problem creating the T-SQL commands, it's using the
Enterprise Manager to do more than create dbs is where I get lost.

I will give that a try (creating a back up job and scheduling)

THanks!
Rick

"Erland Sommarskog" <esquel@.sommarskog.se> wrote in message
news:Xns991D5A7E496DYazorman@.127.0.0.1...

Quote:

Originally Posted by

Rico (me@.you.com) writes:

Quote:

Originally Posted by

>Thanks, but I should have mentioned that I'm using SQL 2000 and I'm not
>using T-SQL, just trying to create the Maintenance plan from the
>Enterprise Manager.


>
From what you described, you should not use a maintenance plan. Just set
up
a scheduled job to run the BACKUP statement. You can do this from the
GUI where you backup databases, and select Schedule somewhere on a button.
In the end you get a one-step job that has a BACKUP job.
>
Then again, if you have any interest in acquiring basic DBA skills, you
should certainly learn to write basic BACKUP commands in T-SQL.
>
--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
>
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

|||
"Rico" <me@.you.com> wrote in message news:8LvXh.9$_G.8@.edtnps89...

Quote:

Originally Posted by

Thanks Erland,
>
I don't have a problem creating the T-SQL commands, it's using the
Enterprise Manager to do more than create dbs is where I get lost.


That's the problem with using EM.

BTW... the reason you're seeing the behavior you're seeing is that the
maintenance job takes the conservative approach and assumes that the most
recent backup HAS to succeed before it'll delete the older one.

So setting it to 22 hours or anything won't force it to delete the older
file until the new one is successfully created.

Quote:

Originally Posted by

>
I will give that a try (creating a back up job and scheduling)
>
THanks!
Rick
>
>
"Erland Sommarskog" <esquel@.sommarskog.se> wrote in message
news:Xns991D5A7E496DYazorman@.127.0.0.1...

Quote:

Originally Posted by

>Rico (me@.you.com) writes:

Quote:

Originally Posted by

>>Thanks, but I should have mentioned that I'm using SQL 2000 and I'm not
>>using T-SQL, just trying to create the Maintenance plan from the
>>Enterprise Manager.


>>
>From what you described, you should not use a maintenance plan. Just set
>up
>a scheduled job to run the BACKUP statement. You can do this from the
>GUI where you backup databases, and select Schedule somewhere on a
>button.
>In the end you get a one-step job that has a BACKUP job.
>>
>Then again, if you have any interest in acquiring basic DBA skills, you
>should certainly learn to write basic BACKUP commands in T-SQL.
>>
>--
>Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
>>
>Books Online for SQL Server 2005 at
>http://www.microsoft.com/technet/pr...oads/books.mspx
>Books Online for SQL Server 2000 at
>http://www.microsoft.com/sql/prodin...ions/books.mspx


>
>


--
Greg Moore
SQL Server DBA Consulting Remote and Onsite available!
Email: sql (at) greenms.com http://www.greenms.com/sqlserver.html

Friday, March 23, 2012

Have SQL file and want to run it on an FTP connection

i have a db set up that i access using an ftp connection, and an sql script
file that i wish to run, how do i go about doing so?

tia|||strawberry (1@.2.com) writes:
> i have a db set up that i access using an ftp connection, and an sql
> script file that i wish to run, how do i go about doing so?

You have an SQL Server database that you access through FTP? Maybe I'm
dense, but I don't understand how you could do that?

Of course, if your FTP client permits you to run remote scripts, you
could upload your script, and then run it with OSQL.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||Hi

Why can you not get access to connect to the remote server, possibly through
a VPN?

John

"strawberry" <1@.2.com> wrote in message
news:exJ8f.19119$6i4.3749@.newsfe7-gui.ntli.net...
>i have a db set up that i access using an ftp connection, and an sql script
>file that i wish to run, how do i go about doing so?
> tia|||This is half the problem; I'm not sure how or where to execute the file.
I've tried all sorts of things but suspect I don't have the correct
privileges (it's a university server)

thanks for your reply.

"John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:436396ac$0$361$da0feed9@.news.zen.co.uk...
> Hi
> Why can you not get access to connect to the remote server, possibly
> through a VPN?
> John
> "strawberry" <1@.2.com> wrote in message
> news:exJ8f.19119$6i4.3749@.newsfe7-gui.ntli.net...
>>i have a db set up that i access using an ftp connection, and an sql
>>script file that i wish to run, how do i go about doing so?
>>
>> tia
>>|||its hosted on a linux server which has ftp access to it. what is osql?, and
where is the best place to upload it to?

thanks for your reply

"Erland Sommarskog" <esquel@.sommarskog.se> wrote in message
news:Xns96FE92704E4B5Yazorman@.127.0.0.1...
> strawberry (1@.2.com) writes:
>> i have a db set up that i access using an ftp connection, and an sql
>> script file that i wish to run, how do i go about doing so?
> You have an SQL Server database that you access through FTP? Maybe I'm
> dense, but I don't understand how you could do that?
> Of course, if your FTP client permits you to run remote scripts, you
> could upload your script, and then run it with OSQL.
>
> --
> Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
> Books Online for SQL Server SP3 at
> http://www.microsoft.com/sql/techin.../2000/books.asp|||strawberry (1@.2.com) writes:
> its hosted on a linux server which has ftp access to it. what is osql?,
> and where is the best place to upload it too ?

The database is on a Linux server? OK, then you are in the wrong newsgroup.
This newsgroup is about MS SQL Server, which runs only on Windows. (And
OSQL is a command-line tool to access MS SQL Server. On Windows.)

I recommend that you find out which DB engine you are using, and then
find a forum for that engine. Or even better ask the staff at the
university where the DB engine is.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||ahh, oppsy! . hadn't thought of that!

many thanks.

"Erland Sommarskog" <esquel@.sommarskog.se> wrote in message
news:Xns96FF6A8A99F2Yazorman@.127.0.0.1...
> strawberry (1@.2.com) writes:
>> its hosted on a linux server which has ftp access to it. what is osql?,
>> and where is the best place to upload it too ?
> The database is on a Linux server? OK, then you are in the wrong
> newsgroup.
> This newsgroup is about MS SQL Server, which runs only on Windows. (And
> OSQL is a command-line tool to access MS SQL Server. On Windows.)
> I recommend that you find out which DB engine you are using, and then
> find a forum for that engine. Or even better ask the staff at the
> university where the DB engine is.
> --
> Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
> Books Online for SQL Server SP3 at
> http://www.microsoft.com/sql/techin.../2000/books.asp

Have DTS Package prompt for a file name

I have a user in the IT department that wants a process to take his text file and import it into a SQL Server table. Simple enough with a DTS package.

The rub is that he wants to execute the package and have it prompt him at that point for where the file resides. I tried to get him to go into the DTS package and update the connection, but he doesn't want to do it that way. I suggested renaming the file to a common name to be used each time the application runs, and he wasn't interested in that solution either.

Any help you can give would be greatly appreciated. Thanks!

Have a look at this:

http://www.sqldts.com/default.aspx?226

Have an .mdf file but can't seem to attach it.

We had a server that was having all kinds of problems. We finally got it
booted and I copied the .MDF file off the hard drive. I'm trying to create a
new database on a different machine using this .MDF file and it appears to
not work. I had shut down SQL on the machine before I copied the file but I
didn't detach it (didn't have the time and I was trying to explain it to
somebody else). So now I have this .MDF file that I can't attach. I'm doing
the following command:
use master
go
CREATE DATABASE BugTracking on (Filename='D:\Program Files\Microsoft SQL
Server\MSSQL\Data\BugTracking.mdf') for attach
go
and I'm getting the following error:
Msg 1813, Level 16, State 2, Line 1
Could not open new database 'BugTracking'. CREATE DATABASE is aborted.
Msg 602, Level 21, State 50, Line 1
Could not find row in sysindexes for database ID 23, object ID 1, index ID
1. Run DBCC CHECKTABLE on sysindexes.
I've also tried a direct attach, deleting the log file, and it won't do it.
Anybody have any thoughts?
TIA - Jeff.
Can you try sp_attach_single_file_db and see if it works?
See:
http://msdn2.microsoft.com/en-us/library/ms174385(SQL.90).aspx
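
A minimal sketch of that call, reusing the path from the original post:

```sql
-- sp_attach_single_file_db attaches a database from just its .mdf,
-- rebuilding the missing log file.
EXEC sp_attach_single_file_db
    @dbname = N'BugTracking',
    @physname = N'D:\Program Files\Microsoft SQL Server\MSSQL\Data\BugTracking.mdf'
```

Note that it rebuilds the missing log; it cannot repair a damaged data file.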

Adam Machanic
SQL Server MVP
Author, "Expert SQL Server 2005 Development"
http://www.apress.com/book/bookDisplay.html?bID=10220
"Mufasa" <jb@.nowhere.com> wrote in message
news:eH$ODo1sHHA.840@.TK2MSFTNGP03.phx.gbl...
> We had a server that was having all kinds of problems. We finally got it
> booted and I copied the .MDF file off the hard drive. I'm trying to create
> a new database on a different machine using this .MDF file and it appears
> to not work. I had shut down SQL on the machine before I copied the file
> but I didn't detach it ( didn't have the time and I was trying to explain
> it to somebody else. ) So now I have this .MDF file that I can't attach.
> I'm doing the following command:
>
> use master
> go
> CREATE DATABASE BugTracking on (Filename='D:\Program Files\Microsoft SQL
> Server\MSSQL\Data\BugTracking.mdf') for attach
> go
>
> and I'm getting the following error:
>
> Msg 1813, Level 16, State 2, Line 1
> Could not open new database 'BugTracking'. CREATE DATABASE is aborted.
> Msg 602, Level 21, State 50, Line 1
> Could not find row in sysindexes for database ID 23, object ID 1, index ID
> 1. Run DBCC CHECKTABLE on sysindexes.
>
> I've also tried to a direct attach deleting the log file and it won't do
> it.
> Anybody have any thoughts?
> TIA - Jeff.
>
|||Sorry everybody - I finally figured out what was wrong - I was trying to
attach the file from a 2005 db into 2000. Once I switched to another server,
it worked fine.
J.
"Adam Machanic" <amachanic@.IHATESPAMgmail.com> wrote in message
news:E95832E9-C1D4-4BC3-B010-6F7079F2AEAF@.microsoft.com...
> Can you try sp_attach_single_file_db and see if it works?
> See:
> http://msdn2.microsoft.com/en-us/library/ms174385(SQL.90).aspx
>
> --
> Adam Machanic
> SQL Server MVP
> Author, "Expert SQL Server 2005 Development"
> http://www.apress.com/book/bookDisplay.html?bID=10220
>
> "Mufasa" <jb@.nowhere.com> wrote in message
> news:eH$ODo1sHHA.840@.TK2MSFTNGP03.phx.gbl...
>

have a problem

Who knows how to fix this?
> MMC cannot open the file C:\Program Files\Microsoft SQL
Server\80\Tools\BINN\Sql Server Enterprise Manager.MSC.
This may be because the file doesn't exist, is not an MMC console, or
was created by a later version of MMC. This may also be because
Thx
Vegita
---
Posted via http://www.mcse.ms
---
View this thread: http://www.mcse.ms/message412335.html
Hi Vegita.
The error indicates one of a few things is the problem - have you checked
each of them? Can you offer us anything more to work with, such as:
(a) Does the file actually exist?
(b) Have you upgraded MMC.exe? (added new service packs lately etc)
(c) Have you uninstalled the SQL client tools perhaps?
A simple catch-all approach to solving the problem might be to simply
re-install the SQL Server Client Tools (not the server), which will probably
get you out of jail.
Regards,
Greg Linwood
SQL Server MVP
"Vegita" <Vegita.11xyqq@.mail.mcse.ms> wrote in message
news:Vegita.11xyqq@.mail.mcse.ms...
> Who knows how to fix this..
> who knows how to fix this
> > MMC cannot open the file C:\Program Files\Microsoft SQL
> Server\80\Tools\BINN\Sql Server Enterprise Manager.MSC.
> This may be because the file doesnt exist, is not an MMC console, or
> was created by a later version of MMC. This may also be because
> Thx
>
> Vegita
> ---
> Posted via http://www.mcse.ms
> ---
> View this thread: http://www.mcse.ms/message412335.html
>

Wednesday, March 21, 2012

Has anyone tried...

I currently have a DTS package that takes a text file source and transforms the data into a table. The transformation does a lookup on a product_code column (char(8)) to transform it to the correct product_id (int) for our system.

I've recently setup a view that matches the file layout and has an instead of trigger that inserts into the target table. It does this by joining the inserted table to a translation table to get the correct product_id for the insert. Instead of a DTS package I created a proc that BULK INSERTS the file into the view.

The view approach is running over twice as fast as the DTS package. Is this approach common, or has anyone else tried it? Any feedback would be appreciated.
|||I don't think it is uncommon to see that kind of performance gain. DTS is very good at doing complex things, but it can't compete with BULK INSERT or BCP for doing simple things. The trick is to figure out which tool is best for a given job.
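For anyone curious, the view-plus-trigger load described above can be sketched roughly like this (table, view, column, and file names are made up for illustration):

```sql
-- View whose column list matches the text file layout
CREATE VIEW dbo.v_ProductLoad
AS
SELECT pc.product_code, p.qty
FROM dbo.Product p
JOIN dbo.ProductCode pc ON pc.product_id = p.product_id;
GO
-- INSTEAD OF trigger translates product_code -> product_id on the way in
CREATE TRIGGER dbo.trg_ProductLoad ON dbo.v_ProductLoad
INSTEAD OF INSERT
AS
INSERT dbo.Product (product_id, qty)
SELECT pc.product_id, i.qty
FROM inserted i
JOIN dbo.ProductCode pc ON pc.product_code = i.product_code;
GO
-- The proc then bulk inserts the file straight into the view
BULK INSERT dbo.v_ProductLoad
FROM 'D:\feeds\products.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');
```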

-PatP|||I forgot to mention the file contains 9.9 million records with 7 columns. The DTS package runs in 20 minutes vs. 8 for the view. BCP'ing the file directly into a table with no translation takes 3 minutes.|||DTS is very good at doing complex things...
-PatP

I'm still waiting for my burger and fries....

There's NOTHING that will beat a bcp load to a staging table and set-based T-SQL processing to fix whatever it is you need to fix...especially if the file to be loaded is in native format...|||Try loading a Notes log file into SQL Server doing LDAP lookups for the VPN-derived data. DTS can do it nicely using the API; BCP can't get there from here without something that can at least produce a file first, and nothing I've seen will do that very well.

The jobs that BCP can do, it does VERY well. I don't think that anything can be faster than BCP. The jobs that BCP can't do... it can't do.

-PatP

Monday, March 19, 2012

has anyone done something like this

I have to write an app that will take a CSV file and parse the results into SQL Server. Not a big deal really, but I need to be able to work on the data and return my own values for SQL Server depending on the data. Ideally it would be nice if I could create this as an executable. Does anyone know of any good tutorials that describe how to do this?

Thanks
Here's how to read CSV files using .NET:
http://weblogs.asp.net/fmarguerie/archive/2003/10/01/29964.aspx

You could read it into a DataSet and work with it from there.
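One way to do that (a sketch only; the folder and file names are placeholders) is to let the Jet text driver parse the CSV into a DataSet, massage the rows, then write them to SQL Server however you like:

```csharp
using System;
using System.Data;
using System.Data.OleDb;

class CsvLoader
{
    static void Main()
    {
        // Data Source is the *folder*; the file name goes in the SELECT
        string connStr = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\feeds;" +
                         "Extended Properties=\"text;HDR=Yes;FMT=Delimited\"";
        using (OleDbConnection conn = new OleDbConnection(connStr))
        {
            OleDbDataAdapter da =
                new OleDbDataAdapter("SELECT * FROM data.csv", conn);
            DataSet ds = new DataSet();
            da.Fill(ds, "csv");
            // Inspect/translate values here before inserting into SQL Server,
            // e.g. via SqlCommand or SqlBulkCopy against the DataTable.
            foreach (DataRow row in ds.Tables["csv"].Rows)
            {
                // ... apply your own value mapping ...
            }
        }
    }
}
```

Compiled as a console app, this gives the user the standalone executable they asked for.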

Wednesday, March 7, 2012

Hard time managing log files

Hi,
I have a small DB, about 200MB, but my transaction log files grows regulary
over 900MB.
The only way I have been able to manage the log file is to detach the
database, delete the log file and reattach the DB.
Is there any better way to manage growing log files?
Thanks in advance
Christian
Hi,
Check the recovery model for your database using the command below:
SELECT DATABASEPROPERTYEX('<dbname>', 'Recovery')
If the recovery model is Simple, the LDF should not grow big until you run some
batch operation. In that case, once the transaction is over you can
shrink the LDF file using the DBCC SHRINKFILE command.
If the recovery model is either FULL or BULK_LOGGED, you need to take
transaction log backups using the BACKUP LOG command. This
will keep the LDF file from growing.
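Concretely, the two cases look something like this (database and file names are illustrative):

```sql
-- Simple recovery: shrink the log after the batch operation completes
DBCC SHRINKFILE (MyDb_Log, 100);   -- MyDb_Log = logical log file name; 100 MB target

-- Full / bulk-logged recovery: back the log up on a schedule instead
BACKUP LOG MyDb TO DISK = 'D:\Backup\MyDb_log.trn';
```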
Thanks
Hari
SQL Server MVP
"Christian Perthen" <abracadabara@.dontreplytothidress.com> wrote in
message news:e2YgKk8QFHA.3928@.TK2MSFTNGP09.phx.gbl...
> Hi,
> I have a small DB, about 200MB, but my transaction log files grows
> regulary
> over 900MB.
> The only way I have been able to manage the log file is to detach the
> database, delete the log file and reattach the DB.
> Is there any better way to manage growing log files?
> Thanks in advance
> Christian
>

Monday, February 27, 2012

Handling SQL Exception

I'm unsure how to handle an SQL Exception correctly when the database is unavailable/offline.

I have my aspx file with the C# code-behind, but all of the SQL stuff is done in a separate code file in the App_Code directory.
E.g.

CODE-BEHIND
DatabaseModifier.deleteUser(username);

DATABASEMODIFIER.cs
public static void deleteUser(string username)
{
    SqlConnection conn = SqlLogin.SqlConnect;
    // Parameterized to avoid SQL injection via the username value
    SqlCommand command = new SqlCommand("DELETE FROM <table> WHERE Username = @username", conn);
    command.Parameters.AddWithValue("@username", username);
    conn.Open();
    command.ExecuteNonQuery();
    conn.Close();
}

Now, that code works perfectly, however if the database I'm connecting to is offline, an SQLException is thrown and because the SQL is handled in my DatabaseModifier class, I'm not sure how to handle it correctly.
If I use a Try/Catch block in my code-behind, it doesn't get thrown because the error occurs in my DatabaseModifier class. If I use a Try/Catch block in my DatabaseModifier class, what can I put in the catch block that will inform the user of the database being offline and/or how can I perform a url redirection?

Any help is greatly appreciated.

You can wrap the connection-open code in the DatabaseModifier class in a try-catch block, and in the catch block just rethrow the exception. Then, in the page code-behind, wrap the database call in its own try-catch block; there you'll receive any exception generated in the DatabaseModifier class. Since the code in the DatabaseModifier class does nothing but rethrow the exception, you can handle it in the page code.

Hope this will help.

|||

So in the DatabaseModifier class:
Try
{
code
}
Catch(SqlException)
{
//nothing in here
}

In code behind:
Try
{
code
}
Catch(SqlException e)
{
//output e.Message to user?
}

|||

Something like this:

In the DatabaseModifier class:
Try
{
    code
}
Catch(SqlException exp)
{
    throw;  // rethrow, preserving the original stack trace
}

In the code-behind:
Try
{
    code
}
Catch(SqlException e)
{
    //output e.Message to user?
}

Now you can see that the db modifier just rethrows any exception it receives during an operation (which is fair, since the db modifier class has no way to show output to the user), and the page code-behind is set up to handle exceptions thrown by the db modifier as well as any other exceptions raised during its own execution.
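Put together, the pattern might look like this (the Users table name and redirect target are illustrative, since the original post elided them; using blocks also guarantee the connection is closed even when an exception fires):

```csharp
// In DatabaseModifier: let SqlException propagate to the caller
public static void DeleteUser(string username)
{
    try
    {
        using (SqlConnection conn = SqlLogin.SqlConnect)
        using (SqlCommand command = new SqlCommand(
            "DELETE FROM Users WHERE Username = @username", conn))
        {
            command.Parameters.AddWithValue("@username", username);
            conn.Open();
            command.ExecuteNonQuery();
        }
    }
    catch (SqlException)
    {
        throw;  // rethrow; the page decides how to tell the user
    }
}

// In the page code-behind: catch it where you can talk to the user
try
{
    DatabaseModifier.DeleteUser(username);
}
catch (SqlException)
{
    Response.Redirect("~/DatabaseOffline.aspx");  // or show a friendly message
}
```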

Hope this will help.