
Friday, March 30, 2012

Having problems with details selection

I have a packing slip report with 2 detail sections. Can I select which detail section to display/print as the line item, based on whether the line item is a stock item (orderitem.type = 1) or a miscellaneous item (orderitem.type = 3)? I don't want to use the suppress method; it needs to be one section or the other. Thanks.

What's wrong with conditionally suppressing each detail sub-section as required? That would seem to be exactly what you want.

Wednesday, March 28, 2012

Having Problems with Report Parameters

I am trying to generate a report based on 3 parameters:

age, location, ethnicity

Everything works fine in the Data and Layout tabs,

but when I run the Preview tab, it gives me the option to input the parameters, and then when I hit View Report, it shows "processing report..." for an indefinite time.

I tried executing the query in the Data tab; it takes less than a second.

Any ideas?

Am I doing something wrong with the parameters?

Raja,

I'm going on the assumption that you are using SQL Server as your data source. If so, you can start a SQL Profiler trace on the database you're using.

In the profiler, you can watch the query statement from SQL Reporting Services being executed. It will reveal exactly what is being passed in your parameters. You can copy this statement and execute it in a new query window to help you debug your issue.

I hope that this helps.

Ham

|||

It's some weird problem.

It's hitting the database, and I double-checked it in the profiler.

I don't know what's happening. I deleted the data cache files in the folder and tried rerunning it again...

Nothing works.

The Data tab works fine, 1000 times out of 1000.

|||

What happens when you take the statement from the profiler and run it in a new query window? Does that execution of the statement still take a long time?

Ham

|||

Hi,

I once had sort of the same problem. I hadn't set the data type of the parameters correctly, which caused problems when viewing the report.

Greetz,

Geert

Geert Verhoeven
Consultant @ Ausy Belgium

My Personal Blog

|||

Yes,

I checked it with Query Analyzer, Profiler, and the Reporting Services Data tab.

Everything takes less than a second to execute against the 15,000 records.

This is one thing I noticed now after some weird research:

when I hit the Preview tab and input the parameters, it doesn't show anything (I mean, it shows "report processing...").

But now when I deploy it to the server and try to view it, IT'S WORKING.

Where the h... is the problem?

|||

Anyone have an email ID of any of the webcast instructors who have instructed for RS before...

|||

Any help?

Monday, March 19, 2012

Has anyone here been both an Oracle and SQL Server DBA?

If so, is the role different based on the platform? More stress/ hours in
one than the other? Is the work a lot different between the two? Which did
you enjoy more? Why?
TIA, ChrisR

I have not been both, but I have worked as a data architect in both.
Oracle DBAs are generally specialists. They tend to specialize in a specific area of Oracle. If Oracle is on Unix, which is all you will see in the corporate arena, there will be DBAs who specialize in shell scripting. Others will specialize in PL/SQL. Others in backup and maintenance. In the SQL Server world, the DBA does it all, because it is not nearly as complex as Oracle has made it.
Having worked with both databases, I can say that SQL Server is much easier to work with than Oracle. But I have also worked with SQL Server since 3.21 and Oracle only since 8i, so many more years with SQL Server.
ChrisR wrote:
> If so, is the role different based on the platform? More stress/ hours in
> one than the other? Is the work a lot different between the two? Which did
> you enjoy more? Why?
> TIA, ChrisR

|||

I was an Informix DBA for 15 years (and a SQL Server DBA for 1) and have known a couple of Oracle DBAs.
Oracle is much more complicated to admin than SQL Server or Informix. However, that shouldn't have anything to do with stress level. That said, since there are usually several Oracle DBAs to one or two SQL Server DBAs, they tend to know less about what is happening (big picture), which may lead to greater stress. I haven't seen it, but it could be.
"ChrisR" wrote:
> If so, is the role different based on the platform? More stress/ hours in
> one than the other? Is the work a lot different between the two? Which did
> you enjoy more? Why?
> TIA, ChrisR
>
>

|||

ChrisR wrote:
I've been both an Oracle DBA and a Microsoft SQL Server DBA.
> If so, is the role different based on the platform?
No. The roles aren't different based upon the platform. They're different
based upon whether you're a production DBA or a development DBA. As a
production DBA, you're responsible for high availability, security,
performance tuning, business continuity, data migrations, and so on. As a
development DBA, you're still the caretaker of the database, but you're also
often responsible for designing the architecture of new databases, helping
application developers with code, and so on. If you're a development DBA,
you'd better be an excellent coder because everyone comes to you when their
code doesn't work.
> More stress/ hours in
> one than the other?
Although Oracle is more complex, I had less stress and overtime as an Oracle
DBA. All the Oracle instances were designed for high availability. A lot
more money and effort went into the Oracle platform than the SQL Server
platform, and it showed. The Oracle platform was rock solid, and most of the
Oracle servers had been running continuously for three to six years. Not so
with SQL Server. Nearly every time a patch came out from Microsoft for
either the Windows servers or SQL Server, we'd have to reboot afterwards.
Oracle and Unix are designed so that installing patches doesn't require a
reboot except in the most extreme circumstances. And an ill-designed query
by inexperienced users could bring SQL Server 7 or 2000 to its knees, but
Oracle 7, 8i, or 9i kept chugging along.
The keys to having the least amount of stress and overtime as a DBA are
training, planning, and preparation.
> Is the work a lot different between the two?
You're the caretaker for the organization's data. You have a lot of
responsibility on your shoulders, whether as a SQL Server DBA or an Oracle
DBA. Oracle is more complex, so requires more training and OJT to reach the
level of "experienced." SQL Server is designed to work right out of the box
and requires less work administering, but the most recent versions of Oracle
are designed such that one could install it using the default settings and
get an Oracle instance up and running with almost no training. However,
using all the defaults will lead to performance problems and inevitable
disasters for the inexperienced Oracle DBA.
> Which did
> you enjoy more? Why?
I thoroughly enjoyed both of them, but probably Oracle a little more. With Oracle I worked side by side with Oracle DBAs with 15 to 25 years' experience, so any problem that erupted had an expert with an instant solution to fix it. It was an excellent training ground for new DBAs, as I was at the time. I learned lots of new things every day with Oracle, but I learned new things every day with SQL Server, too. The SQL Server DBAs I learned from had between one and four years' experience, so they didn't go into the same depth as the DBAs with decades of experience. SQL Server was more exciting because more things went wrong. (It wasn't on as robust a platform as our Oracle instances.)
--
Message posted via SQLMonster.com
http://www.sqlmonster.com/Uwe/Forums.aspx/sql-server/200612/1

Friday, March 9, 2012

Hardware considerations for Report Server?

Hi all,
we use SSRS based on SSAS Cubes.
We thought about installing SSAS on one server, the relational data on another SQL Server, and putting the Report Server with IIS 6.0 on a third machine.
As we don't have any experience with load and performance for BI applications, I'm wondering how much memory we should buy for each server.
The suggestion from our system service is the following:
32 GB for the analysis server
16 GB for the relational database server
4 GB for the web and reporting server
I have the impression that 4 GB for the web/reporting server is too little, but the memory for the other two sounds pretty good.
What do you say?
Thanks a lot for any answers...
Cheers
Marc

You didn't talk about the expected load on the server and the number of CPUs. My experience is that 4 GB is fine, but I would want multiple CPUs. If you design the way you should (i.e. limit the data at the source and limit the report to what the user needs to see), then that configuration will be pretty darn good. I would look at 4 CPUs (but again, it depends on your load).
Without knowing the load (expected number of reports per hour for instance),
it is really hard to say.
--
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"Marc" <Marc@.discussions.microsoft.com> wrote in message
news:17E86835-FF86-429F-8424-A7BE431093B1@.microsoft.com...
> Hi all,
> we use SSRS based on SSAS Cubes.
> We thought about installing SSAS on one server, the relational data on an
> other SQL Server and put the Report Server with the IIS 6.0 on a third
> machine.
> As we haven't any experiences with load and performance for
> BI-Applications
> I'm wondering how many memory we should buy for each server.
> The idea of our system service is the following:
> 32 GB for the analysis server
> 16 GB for the relational database server
> 4 GB for the web and reporting server
> I have the impression that the 4 GB for the web/reporting server is too
> few,
> but memory for the two others sounds pretty good.
>
> What do you say?
> Thanks a lot for any answers...
>
> Cheers
> Marc

Monday, February 27, 2012

Handling very large XML result sets

I am writing a .NET-based application to create large XML data files using the SQLXML classes and FOR XML EXPLICIT queries. What are some strategies I can use to break up and process these large result sets? The overhead of issuing multiple queries by breaking them up via WHERE clause filters isn't the way I want to go, since my queries are very large and take significant time to process within SQL Server.

I am currently experiencing out-of-memory exceptions on some larger result sets (~50-60 MB total XML file size). My first attempt was using SqlXmlCommand.ExecuteXmlReader and an XmlDocument via this snippet of code:
XmlReader xr = forXMLCommand.ExecuteXmlReader();
XmlDocument xd = new XmlDocument();
xd.Load(xr);
This throws a System.OutOfMemoryException on the call to ExecuteXmlReader
when the result set gets very large.
I also tried using SqlXmlCommand.ExecuteStream thinking I could read a
buffer of chars at a time to process these large result sets but this also
resulted in a System.OutOfMemoryException on the call to ExecuteStream:
Stream s = forXMLCommand.ExecuteStream();
StreamWriter sw3 = new StreamWriter(mResultsFileName);
using (StreamReader sr = new StreamReader(s))
{
    char[] c = null;
    while (sr.Peek() >= 0)
    {
        c = new char[10000];
        int numRead = sr.Read(c, 0, c.Length);
        sw3.Write(c, 0, numRead);
    }
}
I have tried running my application on two different systems, one with 1 GB of main memory and the other a Win2K3 server with 8 GB of main memory. Both systems seem to run out of memory at the same 50-60 MB limit. Are there any .NET memory settings I can tweak to give my .NET application more memory?
Thanks for your suggestions and ideas,
Scott
The XmlReader is a streaming interface which should not run out of memory
via the SqlXmlCommand.ExecuteStream method.
Loading into an XmlDocument however will cache the entire document into
memory.
Can you remove the following two lines from your repro and see if you are
still having the problem:
XmlDocument xd = new XmlDocument();
xd.Load(xr);
Thanks -
Andrew Conrad
Microsoft Corp
http://blogs.msdn.com/aconrad
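
To illustrate the streaming approach Andrew describes, here is a minimal sketch that copies the reader's output straight to a file with XmlWriter.WriteNode instead of loading it into an XmlDocument. It reuses the forXMLCommand and mResultsFileName names from the thread and is untested against this workload; note that the original poster reports the exception inside ExecuteXmlReader itself, so this only shows the pattern, not a verified fix.

// Minimal sketch (assumes System.Xml and the thread's forXMLCommand / mResultsFileName):
// stream the FOR XML result to disk without building an XmlDocument in memory.
using (XmlReader xr = forXMLCommand.ExecuteXmlReader())
using (XmlWriter xw = XmlWriter.Create(mResultsFileName))
{
    // WriteNode copies everything from the reader's current position
    // (the whole document when the reader has just been opened) to the writer.
    xw.WriteNode(xr, true);
}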
|||

Andrew,
That was exactly my thought as well, but ExecuteStream is throwing an OutOfMemoryException. I am NOT calling XmlDocument.Load in the code that uses ExecuteStream.
Here is my full method I am using:
private void ExecuteSQLXMLCommandExecuteStream()
{
    try
    {
        SqlXmlCommand forXMLCommand = new SqlXmlCommand("Provider=SQLOLEDB;DATA SOURCE=Gibraltar;Initial Catalog=RDCModel;User ID=sa;Password=XXXX");
        forXMLCommand.CommandType = SqlXmlCommandType.Sql;

        StreamReader sr1 = new StreamReader(mQueryFileName);
        string query = sr1.ReadToEnd();
        sr1.Close();
        query = query.Replace("\r\n", " ");
        query = query.Replace("\t", " ");
        forXMLCommand.CommandText = query;

        Stream s = forXMLCommand.ExecuteStream();
        StreamWriter sw3 = new StreamWriter(mResultsFileName);
        using (StreamReader sr = new StreamReader(s))
        {
            char[] c = null;
            while (sr.Peek() >= 0)
            {
                c = new char[10000];
                int numRead = sr.Read(c, 0, c.Length);
                sw3.Write(c, 0, numRead);
            }
        }
        sw3.Close();
    }
    catch (SqlXmlException ex)
    {
        ex.ErrorStream.Position = 0;
        string sqlErrorString;
        sqlErrorString = new StreamReader(ex.ErrorStream).ReadToEnd();
        Console.WriteLine(sqlErrorString);
        RDCUtilities.WriteToLog(sqlErrorString);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
        Console.WriteLine(ex.StackTrace);
        RDCUtilities.WriteToLog(ex.Message);
    }
}
""Andrew Conrad"" wrote:

> The XmlReader is a streaming interface which should not run out of memory
> via the SqlXmlCommand.ExecuteStream method.
> Loading into an XmlDocument however will cache the entire document into
> memory.
> Can you remove the following two lines from your repro and see if you are
> still having the problem:
> XmlDocument xd = new XmlDocument();
> xd.Load(xr);
> Thanks -
> Andrew Conrad
> Microsoft Corp
> http://blogs.msdn.com/aconrad
>
|||

Try using SqlXmlCommand.ExecuteToStream() instead of ExecuteStream.
Because of some technical limitations with COM interop, ExecuteStream
caches results.
Andrew Conrad
Microsoft Corp
http://blogs.msdn.com/aconrad
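
For reference, here is a minimal sketch of the ExecuteToStream approach Andrew suggests, assuming the Microsoft.Data.SqlXml managed classes and reusing the connection string, mQueryFileName and mResultsFileName from the code above; it illustrates the pattern and is not a tested drop-in replacement.

// Minimal sketch: ExecuteToStream writes the FOR XML result directly into the
// stream you pass in, so the full document is never cached in memory.
SqlXmlCommand cmd = new SqlXmlCommand("Provider=SQLOLEDB;DATA SOURCE=Gibraltar;Initial Catalog=RDCModel;User ID=sa;Password=XXXX");
cmd.CommandType = SqlXmlCommandType.Sql;
cmd.CommandText = File.ReadAllText(mQueryFileName).Replace("\r\n", " ").Replace("\t", " ");
using (FileStream fs = new FileStream(mResultsFileName, FileMode.Create, FileAccess.Write))
{
    cmd.ExecuteToStream(fs);
}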
