Salman (Webmaster)
Joe Sack
Paschal L.

Most Recent Posts

SQL Server 2000 White Papers

Looking for SQL Server 2000 White Papers?  You can find lots of them here:


[ read more...]

--Joseph Sack, 6/11/2004 9:05:00 AM

Query members of database roles

Here is a query I dug up from sp_helprolemember to show a list of database role members.  I cleaned it up to use INNER JOINs instead of commas/old style joins.

SELECT  DbRole = g.name,
 MemberName = u.name
FROM  sysmembers m
INNER JOIN sysusers g ON
 g.uid = m.groupuid
INNER JOIN sysusers u ON
 u.uid = m.memberuid
WHERE g.issqlrole = 1
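As a usage sketch, the same query can be narrowed to a single role and sorted (the role name here is just an example):

```sql
-- List members of one role only (db_owner is just an example) and sort them
SELECT  DbRole = g.name,
        MemberName = u.name
FROM    sysmembers m
INNER JOIN sysusers g ON g.uid = m.groupuid
INNER JOIN sysusers u ON u.uid = m.memberuid
WHERE   g.issqlrole = 1
  AND   g.name = 'db_owner'
ORDER BY u.name
```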

[ read more...]

--Joseph Sack, 6/8/2004 3:55:00 PM

sql_variant Question - does it impact performance?

Today I was asked by a co-worker what value there is to using the SQL Server 2000 sql_variant data type.

From what I've researched, there is some conflicting information on its use.  For example, the following SQL-Server-Performance article argues strongly against using it:

http://www.sql-server-performance.com/datatypes.asp  (search on sql_variant)

SQL BOL mentions some interesting effects when you are comparing sql_variant data type columns:

"These rules apply to sql_variant comparisons:

  • When sql_variant values of different base data types are compared, and the base data types are in different data type families, the value whose data type family is higher in the hierarchy chart is considered the higher of the two values.

  • When sql_variant values of different base data types are compared, and the base data types are in the same data type family, the value whose base data type is lower in the hierarchy chart is implicitly converted to the other data type and the comparison is then made.

  • When sql_variant values of the char, varchar, nchar, or varchar data types are compared, they are evaluated based on the following criteria: LCID, LCID version, comparison flags, and sort ID.  Each of these criteria are compared as integer values, and in the order listed. "

What's more, the following article at http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnpag/html/scalenetchapt14.asp praises sql_variant for its flexibility, but does state that using the data type affects performance due to additional data type conversions.
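To make the first rule concrete, here is a minimal sketch (SQL Server 2000 syntax); the outcome follows the data type hierarchy chart, where the numeric family sits above the character family:

```sql
-- Comparing sql_variant values whose base types are in different families
DECLARE @num sql_variant, @str sql_variant
SET @num = CAST(2 AS int)             -- numeric family
SET @str = CAST('100' AS varchar(10)) -- character family, lower in the chart
-- @num is considered higher because its family is higher in the hierarchy,
-- even though '100' would sort after '2' in a plain string comparison:
SELECT CASE WHEN @num > @str THEN 'numeric wins' ELSE 'character wins' END AS Result
```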

[ read more...]

--Joseph Sack, 6/8/2004 9:48:00 AM

DTC Service for Windows 2003 and SQL Server

I recently installed SQL Server 2000 on Windows 2003 Server, and found that although the DTC service was running, it was not enabled to communicate over the network.

I discovered this when attempting to run a distributed query, accessing a linked server.  I received the following error message:

Server: Msg 7391, Level 16, State 1, Procedure  Line 39
The operation could not be performed because the OLE DB provider 'SQLOLEDB' was unable to begin a distributed transaction.
[OLE/DB provider returned message: New transaction cannot enlist in the specified transaction coordinator. ]
OLE DB error trace [OLE/DB Provider 'SQLOLEDB' ITransactionJoin::JoinTransaction returned 0x8004d00a].

It looks like on Windows 2003, DTC is secured out-of-the-box.  Communication over the network must be enabled manually... Here is a link to the Microsoft Support document with instructions on how to do this:


[ read more...]

--Joseph Sack, 4/14/2004 1:59:00 PM

Intel Hyper-Threading with SQL Server

A developer I know forwarded me this FAQ from Microsoft:


It is actually good news for those with Intel machines that use hyper-threading technology.  Microsoft does not bill you per logical processor - only per physical processor... Also good news: those with SQL Server 2000 Standard Edition and SP3 can take advantage of the extra logical processors (you don't need Enterprise Edition).

For example, on a 4-way box running SQL Server 2000 Standard Edition SP3, your SQL Server instance can use all 8 logical CPUs.

[ read more...]

--Joseph Sack, 2/2/2004 4:18:00 PM

Remove curly brackets from the uniqueidentifier data type in DTS

If you have ever exported data using DTS from SQL Server to a Text File, you may have noticed that SQL Server sometimes exports uniqueidentifier data types with curly brackets.  These are the "{" or "}" characters. 

To remove these brackets from your file, cast the source column reference in the view or query you are using to export the data.  For example:

CAST(MyUniqueID AS varchar(36)) AS MyIDWithoutBrackets

The ADO stream will recognize this as a data type other than uniqueidentifier, and will send it without the brackets.
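In the context of an export view, that might look like the following sketch (view, table, and column names here are hypothetical):

```sql
-- Hypothetical export view: the uniqueidentifier leaves as a 36-character
-- string, so DTS writes it to the text file without the curly brackets
CREATE VIEW dbo.vOrdersForExport
AS
SELECT  OrderDate,
        CAST(OrderGUID AS varchar(36)) AS MyIDWithoutBrackets
FROM    dbo.Orders
```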


[ read more...]

--Joseph Sack, 12/22/2003 2:09:00 PM

DTS Export bug from SQL Server to a Text File using the Transform Data Task

Happy Holidays.  Here is a little discovery I made today:

The Scenario:

I was in DTS Designer, using a Transform Data task from SQL Server 2000 to a Text file destination.  The file was to be populated via a View.  This view had large columns, some up to 2000 characters in width.

On the "Source" tab of the Transform Data Task, I selected the view to export.  No problem.

On the "Destination" tab of the Transform Data Task, the "Define Columns" dialog box appeared with a list of the table columns (name, type, size, and Binary checkbox).  I selected the "Execute" button to populate the text file meta-data.

Instead of seeing my new columns in the Destination tab - it was blank.  When I tried again, Enterprise Manager shut down with a "memory read" error.

The Workaround:

Anyhow - I searched around for a reason, and it seems my situation is a bug.  It turns out that the default column width for a text file destination in SQL Server 2000 DTS is 255.  I found the following article that described the issue:




Anyhow - I used the DTS Wizard to export the file instead, making sure to save the Wizard package afterwards (so I could modify it to my liking).  After doing so, the export worked as it should.

[ read more...]

--Joseph Sack, 12/12/2003 9:03:00 PM

Cross-Database Ownership Chaining - do not forget about the database owner

With Service Pack 3 - we were introduced to new settings for cross-database ownership chaining.

The scenario:

You have a database filled with stored procedures which perform SELECT operations against multiple databases.  Pre SP3 - you granted rights to users for execution of these stored procedures.  Everything works.

After installing SP3 - your users start seeing "SELECT permission denied on such and such".  You check to see if cross-database ownership chaining is enabled.  After doing so, you STILL get the "permission denied" error.

The answer (for this scenario): check the database owner!  The database owner (for example, "sa") of the database containing the called procs must also own the databases against which those procs perform SELECT (or other) operations.
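A quick way to check and align this (the database name below is hypothetical, and sp_changedbowner changes ownership, so run it with care):

```sql
-- Check who owns each database in the chain - compare the 'owner' column
EXEC sp_helpdb

-- If the owners differ, align them; e.g. make sa own the called database
USE CalledDatabase
EXEC sp_changedbowner 'sa'
```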


[ read more...]

--Joseph Sack, 9/6/2003 7:38:00 AM

Autoupdate SQL Server

Microsoft seems to be pushing the idea of automatic updates for SQL Server and Yukon.

I fully disagree with this idea.  There are many reasons for this; what follows is a compilation of my thoughts along with comments I received and posts I read.

I would like to know the attitude of other DBAs on the subject.

Randy Holloway posted this:

This idea is something that I can get behind. Vulnerability in your database servers can be very dangerous to your business, and I think that if you run enterprise software like Microsoft's SQL Server and rely on their platform, it may be a good idea to plug into their auto-updates system. I'm looking forward to reading more about how this feature will work.

I like the idea too, but it's totally unfeasible.

Like many others, my SQL box is behind two firewalls, and only visible from a live server, with TCP filtering active, and Netbios service disabled.

So how will Microsoft be able to update the box automatically?!

And there is no way I would open my network to allow this; the risks are too great.


So sadly unrealistic :-(



[ read more...]

--Paschall, 8/25/2003 11:50:00 AM

Query Analyzer Server Connections

Whenever you make a connection to a server using QA (query analyzer) the editor maintains a history of all past connections.

If you are not clear on what I'm referring to, simply open up QA and go to File -> Connect.  You will see a list of all the SQL Servers you have connected to in the past.

I wanted to delete the entries in the list but I couldn't find any GUI interface to do it.  Thanks to a tip by David P. I was able to do it.

Since this is a registry modification, do this at your own risk!

In the registry, find the key under Microsoft SQL Server\
and delete the matching User/Server pair.

The things you learn at http://sqlblogs.com !

[ read more...]

--Salman Ahmed, 8/1/2003 12:58:00 AM

Yukon tools to be added in SQL 2000

Microsoft said next month it will release more information about a reporting-service feature originally designed for Yukon database software, which will be available for SQL Server 2000 by the end of the year.

About 60 customers are currently testing SQL Server Reporting Services, though more than 100 others applied to be a part of the private beta, said Tom Rizzo, a Microsoft group product manager.

Microsoft will be releasing more data about how SQL Server Reporting Services works, including some technical white papers and developer walk-throughs, Rizzo said. The public beta is scheduled for the fall, and the service will be available by the end of the year, included with the SQL Server license.

Interest in this service is high. It was first discussed at Microsoft's TechEd conference in early June. "People are really asking a lot of questions," said Kurt Windisch, director of program development at Levi, Ray and Shoup Inc. in Springfield, Ill.

"Reports are the crux of an organization," Windisch said. "That's what the decision makers want to see, and a technical person wants to please the decision maker."

The software offers several key services for IT administrators and for database administrators, Rizzo said.

It has graphical design tools for reports. "Instead of code, you can drag and drop in Visual Studio to create reports in a graphical environment," Rizzo said. Customers today have a few different options in this regard. They can create their own reports or use a third-party application, such as the one made by Crystal Decisions in Palo Alto, Calif.

source: Bink.nu


[ read more...]

--Paschall, 7/22/2003 1:12:00 PM

Future of Data languages

Some discussion arose from a post on my .NET blog about the future of languages.

I wrote my post after reading
Angelika Langer.

I am posting my comments here because I believe the future will certainly center on data manipulation.

Like many developers, all my projects are database related.  Whatever the project, it is always the same, often boring routine of creating a bucket of Create, Update, Delete functions.

Of course I use some SQL helpers, but I still have to write some code there.

I think for me the future could be a full automation of this process.

I heard Yukon has some promising ideas on this subject.

I imagine for example a Clients database with some methods calling those basic functions:

Client.Address = "Seattle"
Client.FirstName ="Bill"

The language will be 'smart' enough to check if the entry is already there, adding or updating the record if necessary.
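On SQL Server 2000, the 'smart' save described above boils down to an upsert; a minimal sketch of what would run under the covers (the Clients table and its key column are hypothetical):

```sql
-- Insert the client if it is new, otherwise update the existing row
IF EXISTS (SELECT 1 FROM Clients WHERE FirstName = 'Bill')
    UPDATE Clients SET Address = 'Seattle' WHERE FirstName = 'Bill'
ELSE
    INSERT INTO Clients (FirstName, Address) VALUES ('Bill', 'Seattle')
```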

To delete, something like:

Client.Delete.Id.Withconfirmation = 1545

OK, as I said before, I can do that today with some code.  But what about autodiscovery?

If I create a new database, why not have my code discover the new database?

And to push further, why not create the database from my code?

I will write something like:

Products.Name = "Soap"
Products.Id = 151
Products.Price = $500

And the fields will

[ read more...]

--Paschall, 7/22/2003 11:01:00 AM

A Funny Thing Happened on the Way to the Connection Pool...

To get to the gist of this post, skip to the bottom paragraph : )

So I'm working on filtering out data access specific exceptions from my code. I started out with a simple table and a few stored procedures so I can fake some common data exceptions (i.e. primary key violations, constraint violations, etc...) Then I created a sample aspx page to generate the exceptions (if there is one thing I know how to do, it's create a page that throws exceptions.)

My page consists of a SqlConnection, a DataGrid, and a SqlDataReader to populate the grid. This is a pretty common setup for me as I have only recently acquired the n-tier faith.

For grins I started by pausing my SQL Server to keep the page from querying the table and hence, throw an exception. As planned I got the exception, but not on the line I anticipated. I was planning on seeing the exception at the line with the SqlConnection.Open method call. Instead I see it on the line just below, where my data reader is executed.

using(SqlDataReader rdr = cmd.ExecuteReader())

This came as a surprise to me. How did the code make it past the Open method if the SQL Server was paused? Changing the state of SQL Server from "Paused" to "Stopped" resulted in the same exception.

General Network Error. Check your network documentation.

I then changed the password in my connection string. I've seen in the past that a bad connection string will throw an exception at the Open method of the SqlConnection object (either that the server is paused or that the server does not exist/access denied, depending on the state of the server.) I wanted to see if my code would make it past the Open method. Sure enough, the code never made it to the data reader. So I figured it probably has something to do with connection pooling. If the connection string is good, the Open method grabs a connectio

[ read more...]

--Justin Bigelow, 7/19/2003 10:31:00 PM

Yukon HO! (for some people at least)

A friend of mine gets to play with Yukon just because he's a fancy-schmancy MVP.

Bastard. ;)

[ read more...]

--Justin Bigelow, 7/17/2003 7:42:00 PM

I've got the world on a connection string

Last night while moving a new feature for a customer from development to production I noticed what I can only describe as an anomaly. This app tracks work orders in a service center. Most of the functionality revolves around the work orders the users generate. Like most web apps I have a screen for searching for whatever resource (work orders, customer records, sales records, etc...) and then drilling down into the appropriate record. What I noticed was that the first 5 or so of the latest work orders on the production server were showing no data, just a page header. If I go down further to an older work order and drill down everything shows up fine.

My first inclination was to think that one of the fields used in an inner join was null. I thought I had built in all the checks to prevent this from happening, but figured maybe somebody found a bug or other loophole that I missed.

So I fired up Query Analyzer and queried for two records: one that shows up blank, and one that displays properly. Much to my surprise, everything required was in both records. Odd.

Next I did a quick check of the stored procedure that pulls the work order details to make sure everything was coming from the right tables (prod vs. dev). I dismissed this almost before I finished writing sp_helptext because the sproc in my prod server isn't going to find the tables in my dev server. But what the hell, I checked anyway. Just as I thought, everything looked normal. I didn't bother checking the insertion stored procedure because the record looked good when I did the check with QA.

So I went to the next potential point of failure (or annoyance in this case), my aspx page. I immediately found the problem. This page was pointing to the dev server. I switched it, along with a lot of the other pages, over last Friday for some testing. I apparently missed this one page when I switched back. It went unnoticed by myself and the client because I keep the d

[ read more...]

--Justin Bigelow, 7/15/2003 9:06:00 AM


