Thursday, March 29, 2012
Executing SP inside SP dynamically
Exec sp_executesql N'exec procedurename {parameter list}', N'{parameter declaration}', {parameter values}
Now I want to read a particular value that is being returned by the procedure.
NOTE: the procedure is returning a resultset.
Please help me.
Thanks!|||In Books Online, look up the keywords OUTPUT variable and RETURN.|||Hmmm... not sure if the OUTPUT parameters alone will do the job, as dynamic SQL operates within its own scope. Try it and see, but you can also use temp tables as a hack method to pass values across scopes.|||I have a strange problem: I want to execute different stored procedures based on certain criteria defined in the database. I am able to execute the SP using the sp_executesql system stored procedure.
Exec sp_executesql N'exec procedurename {parameter list}', N'{parameter declaration}', {parameter values}
Now I want to read a particular value that is being returned by the procedure.
NOTE: the procedure is returning a resultset.
Please help me.
Thanks!
Try this:
DECLARE @sql nvarchar(2048)
SET @sql = N'SET @count = (SELECT COUNT(*) FROM table1)'
DECLARE @temp int
EXEC sp_executesql @sql, N'@count int OUTPUT', @temp OUTPUT
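To pick up a value from the dynamically-executed procedure itself, the OUTPUT-parameter pattern extends naturally. A minimal sketch (dbo.MyProc and @p1 are hypothetical names):

```sql
DECLARE @ret int
EXEC sp_executesql
    N'EXEC @out = dbo.MyProc @p1 = @p1',   -- hypothetical procedure
    N'@out int OUTPUT, @p1 int',
    @out = @ret OUTPUT, @p1 = 42
SELECT @ret   -- the procedure's RETURN value
```

If the procedure returns a resultset rather than a return value, INSERT ... EXEC into a temp table is the usual way to capture it across the dynamic-SQL scope.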
Jamessql
Tuesday, March 27, 2012
Executing MDX queries from inside SQL Server Stored Procedures (SQL Server 2005)
Hello all,
Does anyone have any idea how to access a cube from stored procedures in SQL Server 2005?
My idea was to use SQLCLR and write a function in .NET that accessed the cube through ADOMD, but there are problems with that.
See the following code sample:
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;
using Microsoft.AnalysisServices.AdomdClient;
public partial class UserDefinedFunctions
{
    [Microsoft.SqlServer.Server.SqlFunction]
    public static SqlString MDXRdr()
    {
        AdomdConnection conn = new AdomdConnection();
        conn.ConnectionString = @"Provider=SQLNCLI.1;Data Source=JOAHSE0\SQL2005;Integrated Security=SSPI;Initial Catalog=dbRenTAK";
        conn.Open();
        // Just output the cube name
        string str = conn.Cubes[0].Name;
        conn.Close();
        return new SqlString(str);
    }
};
First, I wasn't able to add a reference to AdomdClient from Visual Studio. I then added the reference manually in the project file, and it compiles. But when I try to deploy, I get an error message that the "assembly adomdclient was not found in the SQL catalog".
Thanks in advance for any suggestions or hints!
Best Regards,
Johan ?hln
Consultant, IFS
Moving to the "Data Mining" Forum, which is better suited for this question.
Monday, March 26, 2012
Executing a variable inside a stored procedure
Hello :) I need to do something like this:
CREATE PROCEDURE SelectCostumers @name varchar(100)
Declare @SQL = "SELECT Id, Name FROM Costumers"
AS
IF (@name IS NULL)
    @SQL
ELSE
    @SQL += "WHERE Name LIKE @name"
See, what I need is a string variable that I can concatenate with whatever I want depending on the parameter I get.
Thank you
Hi,
Try as below.
CREATE PROCEDURE dbo.SelectCostumers
    @name varchar(100)
AS
BEGIN
    DECLARE @SQL NVARCHAR(500)
    SET @SQL = N'SELECT CustomerID, ContactName FROM Customers'
    IF (@name IS NULL)
    BEGIN
        EXEC sp_executesql @SQL
    END
    ELSE
    BEGIN
        SET @SQL = @SQL + ' WHERE ContactName = ''' + @name + ''''
        EXEC sp_executesql @SQL
    END
END
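One caution on the approach above: if @name comes from user input, concatenating it into the string allows SQL injection. A parameterized sketch of the same procedure body avoids that:

```sql
DECLARE @SQL NVARCHAR(500)
SET @SQL = N'SELECT CustomerID, ContactName FROM Customers'
IF (@name IS NOT NULL)
    SET @SQL = @SQL + N' WHERE ContactName = @name'
EXEC sp_executesql @SQL, N'@name varchar(100)', @name = @name
```

sp_executesql accepts a declared parameter even when the statement does not reference it, so the one call site covers both branches.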
HTH
|||Thank You!!
Friday, March 23, 2012
Executing a Job in a C# SQL Server Assembly
I have a C# assembly I've created that I wish to run inside SQL Server. Since SQL Server doesn't support SMO inside of it, does anyone have any suggestions on how I might execute a SQL Server job in my assembly?
Thanks - Amos.
Using SMO in CLR:
http://sqlblogcasts.com/blogs/simons/archive/2007/03/14/Using-SMO-from-within-SQLCLR.aspx
Alternatively you can start the job using the sp_start_job procedure.
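For the sp_start_job route, the call is a one-liner (the job name is a placeholder):

```sql
EXEC msdb.dbo.sp_start_job @job_name = N'MyNightlyJob'
```

Note that sp_start_job only queues the job and returns immediately; it does not wait for the job to complete.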
Jens K. Suessmeyer
http://www.sqlserver2005.de
Wednesday, March 21, 2012
Executing a command inside a read loop
How can I execute an SQL command inside a read loop? My code looks something like this:
With cmd1.ExecuteReader
    Do While .Read
        cmd2.CommandText = "..."
        Name = CStr(cmd2.ExecuteScalar)
    Loop
    .Close
End With
But it fails at the ExecuteScalar method, with an invalid operation exception.|||ADO.NET 1.x doesn't allow sending a query against a connection that you are currently using. Either have a separate connection for cmd2, or cache the result from cmd1 immediately and loop over that cached result.
Or, use ADO.NET 2.0, and read up on "MARS" (Multiple Active Result Sets). You need to set this attribute in the connection string.
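For reference, the MARS attribute in an ADO.NET 2.0 connection string looks like this (server and database names are placeholders):

```
Server=myServer;Database=myDb;Integrated Security=SSPI;MultipleActiveResultSets=True
```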
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
>|||(I know this isn't the answer that you want to hear)
I would highly recommend looking into how your query is written. You
shouldn't have to loop through things like this. If you're trying to
append a column to the result set of a stored procedure, consider using
the INSERT EXEC syntax:
CREATE TABLE #ResultSet
(
    <column-list>
)
INSERT #ResultSet
EXEC <procedure_name>
or, if the returned column-list is nondeterministic:
SELECT a.*, b.ColumnName
FROM OPENQUERY('...', 'EXEC <procedure-name>') a, <other_table> b
-Alan
|||Philip Sheard (sheardp@myisp.com) writes:
> But it fails at the ExecuteScalar method, with an invalid operation exception.
If you are on SQL 2005 and use ADO .Net 2.0, you can enable MARS.
However, as Alan points out, you are probably barking up the wrong tree. Calling back to the server for every row you get does not bode well for performance. You should try to get that scalar with the first result set.
Erland Sommarskog, SQL Server MVP, esquel@sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx
ExecuteSQLTask
Hey everybody,
Is there any ways that I could execute a "ExecuteSQLTask" from a script task inside the package. I mean both "ExecuteSQLTask" and "ScriptTask" are in the same package.
Any tips?
Thanks
Ryan
Just curious. Why do you want to execute "Execute SQL Task" from a script task?|||I have a set of rules in a table and depending on each rule type I have to execute a different step.|||Why do you think that being able to fire an Execute SQL Task from within a Script task will help you with this?
Your best bet would be to loop over the rules using a ForEach loop and then use conditional precedence constraints within the ForEach loop to decide which Execute SQL Task to execute.
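The conditional precedence constraints mentioned here are just Boolean expressions on the arrows between tasks. With a constraint's evaluation operation set to "Expression", a sketch might look like this (the variable name is a placeholder):

```
@[User::RuleType] == 1
```

On each iteration of the ForEach loop, only the paths whose expressions evaluate to True are followed.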
-Jamie
Monday, March 19, 2012
Execute user stored procedure in master
I have a user stored procedure in master. I would like to be able to execute that stored procedure from an internal web app. Would I give execute permission on the stored procedure to the "public" or "guest" role? The web app would be using a userid/pw for an application database.
Also, is it a good idea to have user stored procedures inside of master? Could someone point me to where I can find a good article on master db best practices?
Thanks in advance.|||Yeah, back it up regularly, and don't put anything in master.
Why are you doing this?|||I suggest you use this form: "SELECT * FROM master.dbo.table"
and put your proc somewhere else.|||The stored procedure was originally thrown in the master db as a way of notifying admins when jobs fail. It's a stored procedure that uses certain extended stored procedures in the master db to send an e-mail message. (No we can't use SQL Mail). Could I put the stored procedure in another user db and allow the id execute permission?
Thanks again.|||I'm pretty sure you can. Did you try it?|||Thanks ortho. I copied the stored procedure from master to a user db and gave the user execute permission on it. I got the following error:
EXECUTE permission denied on object 'sp_OACreate', database 'master', owner 'dbo'|||maybe dbo does not have access to master
try using the schema of the user you gave access to master, OR the user name.
Did you give execute access to dbo?
SELECT * FROM [server].[database].[user].[table]
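On the permissions side, granting execute on the relocated procedure is a one-liner (the user and procedure names here are placeholders):

```sql
GRANT EXECUTE ON dbo.usp_NotifyAdmins TO WebAppUser
```

However, the sp_OA* extended procedures still live in master, and by default only sysadmin members may execute them; that is what the error above is reporting, so the login would also need execute permission on sp_OACreate (and the other sp_OA procedures) in master.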
Friday, March 9, 2012
Execute SQL Task Editor - Variables
Hi,
I'm trying to do an INSERT:
INSERT INTO myTable(column1)
VALUES(..)
How can I use my parameter mapping value inside the SQL statement ablove?
Thank you.
Use ? where you want to use the parameters.
INSERT INTO myTable(column1)
VALUES (?)
|||My SQL Task looks as follows:
Parameter Mapping: User::ID, Direction: Input, Data Type: VarChar, Parameter Name: ID
SQL SourceType:
Direct Input
INSERT INTO Table1(id, name)
VALUES (1,'john')
How can I use my Parameter Mapping value inside the SQL Insert statement (instead of the 1 I put above)?
Thank you.
|||Try using a ParameterName of 0 (zero). Then:
INSERT INTO Table1 (id, name) VALUES (?,'john')|||
It worked!!
Thanks a lot!!
Sunday, February 19, 2012
Execute Package Task behavior
I'm using a For Loop container with an Execute Package Task inside, looping until a folder is empty. I've noticed some strange behaviors:
1. The child package keeps creating new connections. I start with 3 connections to the DB, and when the For Loop container is done I've got 364 connections.
2. The Execute Package Task is pulling the wrong version of the package I've specified. I'm using a package saved to the file system, and there's only one copy on the drive. I've verified the path points to the correct location.
Does anyone have a work-around for the ‘connection generation’ issue?
TIA
Eric
In your first question, do you load child packages from SQL Server? If yes - make sure you have connection pooling OFF.
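If the child-package connection manager is ADO.NET-based, pooling can be switched off directly in the connection string (server and database names are placeholders; the keyword differs for OLE DB providers):

```
Server=myServer;Database=myDb;Integrated Security=SSPI;Pooling=False
```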
Execute Package Task always fails.
I have a package (i.e. child package) which runs itself perfectly fine without displaying any error.
However, when I embedded this package inside another package (i.e. parent package) using an Execute Package Task, the task always fails. It seems strange.
The child package has two variables that need to be passed in from parent package.
Can any expert here please help out? Thanks.
- Steve
What are the errors? Look in the output window and turn on some logging.|||Here is the error I have kept getting:
Error 1 Error loading Scheduling_F580021.dtsx: The connection "{34589715-3053-4b26-9769-0b8ecc198d85}" is not found. This error is thrown by the Connections collection when the specific connection element is not found. C:\Scheduling_data.dtsx 1 1
|||Just double-check the connection manager that is being used by the Execute Package task. I would drop the connection manager and add it back from the Execute Package task: Connection: <New Connection>.
Rafael Salas
|||Here is what I did to fix it ...
I deleted the troubled child package completely and rewrote it from scratch, and it started to work.
Just as a reminder:
Please note that the old troubled child package contained some copy-and-pasted tasks (and connections) from the parent package. I often use copy-and-paste to develop packages. It seems there is some problem with that. Not sure if SSIS has a bug or something else.
|||
The problem looks like it was with a task which had been pasted in. Most tasks actually store the connection ID, the "{34589715-3053-4b26-9769-0b8ecc198d85}" GUID, rather than the name, although you generally always see the name. If you see the GUID, it normally means that the connection with that ID does not exist, and therefore it cannot translate the ID to the name for display, general use, or validation. In other words, the task refers to a connection that does not exist.
Using the ID over the name means that you can rename connections and stuff keeps working, but conversely means that you cannot just create a connection with a matching name and expect it to be the same.
|||
you made a good point.
One should be more cautious when copy-and-pasting tasks, connections, or packages for rapid SSIS development. A package built with copy-and-pasted tasks/connections works perfectly fine if it runs by itself.
This does cause pain when developing a complex package (parent-child packages) if we cannot use copy-and-paste. Is the Microsoft SSIS team aware of it, or do they have any plan to fix it?
|||Personally, I do not see the pain you are talking about. If you are copying a Data Flow task from one package to another, you would eventually need to check or re-define your connection managers, since you may need the same combination of tasks but different connections. Changing the connection managers inside a task is just a click and a selection from a dropdown list.
On the other hand, if you need more than copying the Data Flow task, meaning you also need the references to the connection managers, then you are better off creating a copy of the whole package and then adding and dropping tasks/logic as needed.
This is just my opinion...
Rafael Salas
|||
I not only copy-paste tasks from the parent package to child packages, I copy-paste all connections as well. I use the same connections, which are configured on the fly based on the ConnectionString. The ConnectionString is a variable which is assembled and determined at runtime.
I also copy packages all the time. I use copy-paste on everything for rapid development. Somewhere along the way, strange errors come up when I try to build a complex package (like a grandparent package), and it is very hard to debug. I know and have used the debugging tools and logging as well. It can be painful without knowing the code behind the scenes. I run all of my packages with settings (i.e. connections, queries, and tasks) configured on the fly.
|||Perhaps you should look at making some packages re-usable if you do so much copy and paste. It is often easier said than done, but since you say you do it so much and you find this such an issue I'd be surprised if you could not get some reuse.
The benefit of the current behaviour is the ability to rename connections without breaking task references, something that got a lot of feedback during the beta, and for me that is very useful. Make a suggestion though - http://connect.microsoft.com
|||
As a matter of fact, I exploit the reuse features as much as I can.
Here is the scenario: say I need to develop 300 packages. I do not want to start each one from a blank package. Often I copy-and-paste a similar package and rename it; starting from an existing package can save me a lot of time.
Steve
Execute package on a for each loop container
Hi. I am running a package using 'Execute Package' inside a For Each loop file iteration. I have to iterate through some 10,000s of files to ETL source data, a reasonably complex package. I want to set it going and get some sleep, but every 200 files or so I get a memory exhausted error and the task fails. Is there a short, easy fix to refresh memory on each iteration? If so, please let me know. Just ignoring the failure at the task level won't refresh the memory, will it?
Thanks for the help.
Can you try setting the Execute Package task to have 'OutOfProcess=True'? I am not sure if that will work, but it is worth trying.
Execute multiple statements at once...
I am not even sure if this is possible, but I want to be able to execute several statements (or SPs) at once from inside an SP (or by any other method).
What I am doing: I am taking data from a single column across multiple rows and making it into one row (i.e. data1 + data2 + data3...). But I am doing this to a total of 2.1 million individual rows, and the result will be about 204K rows. What I have written is basically a nested loop, and it works fine but very slowly; it has been running for 24 hours now and is about 60% done. I need this to finish in under 36 hours, preferably...
If I could handle more than one set of data at once (there will be no duplicates), I could speed up the process by however many I feel like working with...
So is there a way to execute a statement (sp or function) and go to the next statement without waiting for the first to finish?
Thanks Big Time... I hope this is possible.
OH.... SQL 2000 SP4 on Windows 2003.
You could open multiple connections from the client, but something tells me that there may be a better way to process your data.
Could you post the relevant schema and code?
|||AFAIK, within SQL Server 2000 you do not have that option. On a single CPU, I doubt the case would be different under 2005 (but I am not an MSSQL expert).
However, "24 hours", or even "hours", for processing 2.1 million rows into 204K rows sounds very slow to me. Maybe if you select the data outside of SQL Server with an isolation level like read uncommitted, prepare it outside, and do a single bulk load, it would be much faster.
How do you do this? All in a single T-SQL batch?
|||It is all within a single T-SQL SP... It is extremely slow, and I can't seem to get it to use more processor or memory...
I am thinking of using a DTS package and working with multiple temp tables, then combining all the temp tables at the end... it should be quicker that way, and I might be able to push my server (which is Enterprise with 12GB of RAM and 2 processors)...
The current schema is:
Final Table = ID_NUM (Key, char(10)), data (varchar(6000))
Working table is ID_NUM, date varchar(100), importance (smallint) - key is ID_NUM and DATA
Working temp table data varchar(100), importance smallint -- no key, always less than 200 rows.
Plus one variable @data that is a varchar(6000)
Loop 1
    Grab an ID_NUM
    Get the list of data for that ID_NUM into the working table
    Loop 2
        Loop through till all rows have been added together or it goes over 6000 characters
        @data = @data + data + data
    end loop 2
    insert ID_NUM, @data into final table
end loop 1
That is what I am doing... no cursors, just WHILE loops...
|||William:
Where you say, "... date varchar (100), ... " for your "Working table" do you really mean "... data varchar (100) ... "?
|||Dave
Yes, that is what I mean...
sorry... working on a million things at once and don't always proofread..
|||It sounds like you're just trying to concatenate all of your rows in the child table into a single row in a third table. If so, the following scenario may help:
http://databases.aspfaq.com/general/how-do-i-concatenate-strings-from-a-column-into-a-single-row.html
You'll likely not get much parallelism in the looping code that you described, so if you can find a set-based method of processing then you'll be much better off.
|||That actually looks good... the problem will be hitting the varchar limit.
I will have to see if I hit it or not...
What fun... lol...
Thanks for all the help.
|||My apologies - I forgot you're on SQL Server 2000. The XML method of processing won't work in this situation. You still want a process where you can build the comma-separated list in a SET operation rather than one row at a time.
There's a method whereby you can assign a value to a variable in one row and use it in subsequent rows of an UPDATE statement. AFAIK, it's undocumented but it works in SQL 2005 as well so you're safe for a little while.
The pseudo-code would look something like this:
-- Place all of the data into a work table
CREATE TABLE #WorkTable (ID_NUM char(10), data varchar(6000), importance smallint, IdentCol INT IDENTITY, PRIMARY KEY (ID_NUM, IdentCol))
INSERT INTO #WorkTable (ID_NUM, data, importance)
SELECT ID_Num, data, Importance
FROM SourceTable
ORDER BY ID_NUM, Importance
-- Update work table. Each row has the data of itself + the row preceding it when there is a matching ID_NUM
DECLARE @ID_NUM CHAR(10)
DECLARE @data VARCHAR(6000)
UPDATE #WorkTable
SET @data = data = CASE WHEN ID_NUM <> @ID_NUM THEN data
                        ELSE @data + data
                   END
  , @ID_NUM = ID_NUM
INSERT INTO FinalTable (ID_NUM, Data)
SELECT ID_NUM, MAX(Data)
FROM #WorkTable
GROUP BY ID_NUM
In case the pseudo-code doesn't work, here's a working sample for the pubs database:
CREATE TABLE #JobTitles (job_id INT, job_desc VARCHAR(100), employees VARCHAR(1000), EmployeeName VARCHAR(100), PRIMARY KEY (job_id, EmployeeName))
DECLARE @job_id INT, @EmployeeNames VARCHAR(1000)
INSERT INTO #JobTitles (job_id, job_desc, EmployeeName)
SELECT J.job_id, J.job_desc, E.lname + ', ' + E.fname AS EmployeeName
FROM jobs J INNER JOIN employee E ON J.job_id = E.job_id
SET @job_id = -1 -- Initialize this to a number that does not exist in the list of values.
UPDATE #JobTitles
SET @EmployeeNames = employees = CASE
        WHEN @job_id <> job_id THEN EmployeeName
        ELSE @EmployeeNames + '; ' + EmployeeName
    END
  , @job_id = job_id
-- Grab the largest row of each grouping.
SELECT job_id, job_desc, MAX(employees) AS EmployeeList
FROM #JobTitles
GROUP BY job_id, job_desc
ORDER BY job_id
DROP TABLE #JobTitles
|||William:
Here is yet another example. I mocked this up with 2.1 million rows. I was able to get this to run in about 3 minutes and 20 seconds. This problem is definitely easier to solve with 2005 than 2000 and there has to be a better way of doing this, but this example might also help you see some direction; I hope it is helpful.
Dave
I used these two tables and faked the data:
create table dbo.final
( [key] char (10) not null,
  [data] varchar (6000) not null
)

create table workingTable
( ID_Num varchar (10) not null,
  [data] varchar (100) not null,
  importance smallint not null,
  constraint pk_mockInput primary key (ID_Num, [data])
)

insert into workingTable
select convert (varchar (5), a.iter),
       '{ ' + convert (varchar (5), b.iter) + ' }'
       + replicate ('-', 1 + b.iter % 29) + '-->',
       b.iter
from small_iterator a (nolock)
inner join small_iterator b (nolock)
   on a.iter <= 21300
  and b.iter <= 40 + a.iter % 119
  and b.iter <= 200
order by a.iter,
         b.iter

update statistics workingTable
exec sp_recompile workingTable
I would prefer not to use the function because there is a fair amount of overhead associated with functions; nonetheless, I implemented the method using the following function:
alter function dbo.assembleData
( @arg_key varchar (10),
  @arg_minData varchar (100),
  @arg_maxData varchar (100)
)
returns varchar (6000)
as
begin
    declare @retValue varchar (6000)
    set @retValue = ''

    select @retValue = @retValue + [data]
    from workingTable
    where [id_num] = @arg_key
      and [data] >= @arg_minData
      and [data] <= @arg_maxData

    return ( @retValue )
end
I then tested out the overall procedure with this query:
|||--truncate table dbo.final
-- The whole thing takes about 3 minutes and 20 seconds
create table #sequencer
( [key] varchar (10) not null,
  [data] varchar (100) not null,
  importance smallint not null,
  seq integer not null,
  siz integer not null,
  constraint pk_#sequencer primary key ([key], [data])
)

insert into #sequencer
select a.id_num,
       a.[data],
       a.importance,
       count(*) as seq,
       sum (datalength (b.[data])) as siz
from workingTable a
inner join workingTable b
   on a.id_num = b.id_num
  and b.[data] <= a.[data]
group by a.id_num, a.[data], a.importance
order by a.id_num, a.[data]

insert into dbo.final
select [key],
       dbo.assembleData ( [key], min_data, max_data )
from ( select [key],
              min ([data]) as min_data,
              max ([data]) as max_data,
              min (seq) as min_seq,
              max (seq) as max_seq,
              case when siz = 0 then 0
                   else siz - 1
              end / 6000 as segment
       from #sequencer
       group by [key],
                case when siz = 0 then 0
                     else siz - 1
                end / 6000
     ) x

drop table #sequencer

select top 20 *
from dbo.final
I will give that method a try...
I have been trying all kinds of things, including DTS with multiple connections;
all have been unsuccessful at producing speedy results... my last attempt took about 10 hours.
THAT WORKED GREAT!!!!!
My job now completes in 1 min 43 sec.... HUGE improvement...
Thanks.
|||Great you solved it.
Would you tell me which method you used? (I know it was not the one I suggested. :) I'm asking because I'm not SQL Server oriented and for the last few days have been seriously diving into SQL Server. Before, I was using it "superficially". Though I was impressed with T-SQL in 2000, 2005 made me an addict. :)
|||I used the solution posted by Jared Ko.
Friday, February 17, 2012
Execute DTS Packages
I have created a DTS package, and now I want to be able to execute it outside of EM. The package imports
data from an SQL dbase to a Visual FoxPro dbase. I would like this DTS
package to execute everytime an Insert(of certain critera) is made on
the SQL dbase. I am thinking that using an Insert Trigger on the SQL
table is the way to go about this. I am seeking advice as to... is this
the best way to go about this.
Also, and if so, in a trigger which route is the best to take...
1. use dtsrun command line utility or
2. setup a COM object to run the DTS package
I have tried both in SQL query analyzer and am having troubles with the
syntax. Could anyone possibly send the code for both ways. Thanks.
Good Day to all,
Brett|||It's not clear why you would want to do this... What are you doing in the DTS that you can't do through a normal trigger? It would help if you can explain more....
Not sure if you can run the package through a trigger... Maybe set up a job and invoke the job... I am also curious...|||I have tried linking the SQL database to the VFP tables with T-SQL and have had no luck. I have tried for weeks to link the two via remote servers or T-SQL, working with them directly in triggers. So far this is the only approach I have been able to get to work. If you have other ideas, any help would be appreciated.
Brett|||You should be able to fire up DTSRUN using sp_oacreate in your trigger.
Have you tried this?
Having a job that calls the DTS is another option, then start the job
in your trigger with msdb..sp_start_job or something like that.
I have done both of these.|||Yes, I have tried this without success. The package executes and works properly when I run it manually in EM. However, when I place the code listed below in SQL Query Analyzer, it does not execute. The code parses fine with no errors, and when it is executed QA returns 'completed the command successfully', but nothing has happened. Here is the code...
DECLARE @object int
DECLARE @hr int
--create a package object
EXEC @hr = sp_OACreate 'DTS.Package', @object OUTPUT
IF @hr <> 0
BEGIN
    PRINT 'error create DTS.Package'
    RETURN
END
EXEC @hr = sp_OAMethod @object,
    'LoadFromSQLServer("ShannonPC", "sa", "", 256, , , , "MiTekTransfer")', NULL
IF @hr <> 0
BEGIN
    PRINT 'error LoadFromStorageFile'
    RETURN
END
EXEC @hr = sp_OAMethod @object, 'Execute'
IF @hr <> 0
BEGIN
    PRINT 'Execute failed'
    RETURN
END
Any ideas as to why this is not running my DTS package.
Thanks|||This code shows an example, I have used it before with success:
http://www.databasejournal.com/feat...cle.php/1459181
The first problem I see is, your LoadFromSQLServer parameter "256" is
incompatible with passing in "sa" as your username. "256" means
DTSSQLStgFlag_UseTrustedConnection, which says to use a trusted
connection. So my guess is that you're having a permissions problem.
Change the "256" to "0" and try again.
I am surprised that you got no error message back from sp_oacreate
though...
Gary|||Yes, I recongized this issue after I made the post and did some further
searching. If I use win. auth. in the package I set parameter to 256,
with no user or pass, and if I set SQL auth. in the package I set
parameter to 0 with user sa. I have tried both with the same
results... no errors and no execution.|||Do you have DTS package logging turned on? Can you tell it's not even
starting the package?
Try just starting DTSRUN.exe from a command line and see if that works.
sp_oacreate may be disabled as a security measure, but if it were, I would expect that you would get an error back.
Can you try putting a dtsrun.exe call in a job? Run the job and see if it starts the DTS package. If not, you have bigger problems.
If it works in the job, use msdb..sp_start_job to start the job from the trigger.|||It seems that the issue might be with connecting to a Visual FoxPro table outside of the DTS program. The package which transfers from SQL to VFP works in DTS but not in SQL Query Analyzer. I set up another DTS package which transfers the same data but from SQL to a text file. This package executes properly in DTS and in SQL QA. It uses the same code as posted above, only changing the package name. The VFP tables are on the local machine, though.
I also tried exec master..xp_cmdshell 'dtsrun /Sshannonpc /E256 /Npackagename'. This executes the package with the text file but not the VFP tables; I get the error 'Invalid class string.' However, no errors occur when running the package manually in DTS. Ideally I would like to import right into the VFP tables and not have another app that reads from the text file and then writes to the VFP tables.|||When you say "the VFP tables are on the local machine", what do you mean? On your local desktop machine, or on the database server?
Sometimes people get local/server paths mixed up. The path to the VFP
needs to be reachable via the server, not your local PC. Is this the
problem?|||Gary,
I am running SQL Server 2000 on my local desktop machine, and the Visual FoxPro tables are located on the C:\ drive of this same computer.
Maybe you could answer this... In SQL Query Analyzer, this command **(exec master..xp_cmdshell 'dtsrun /SShannonPC /E256 /NMiTekTransfer')** to run the DTS package gives me this error:
DTSRun OnError: DTSStep_DTSDataPumpTask_1, Error = -2147221005
(800401F3) Error String: Invalid class string
However, if I run **(dtsrun /SShannonPC /E256 /NMiTekTransfer)** from the command prompt, it executes fine. The DTS package also executes manually from the DTS window.
Any thoughts? Thanks for helping.
Brett
Execute DTS package inside a stored procedure...
Thanks|||In theory, you could run DTSRUN from inside a stored procedure, like so:
CREATE PROCEDURE usp_ImportData
AS
DECLARE @DTSRUNCommand varchar(255)
TRUNCATE TABLE MyTable
SET @DTSRUNCommand =
    'DTSRUN /N ' +
    'MyImportPackage /E /S ' +
    @@SERVERNAME
EXEC master..xp_cmdshell @DTSRUNCommand
This is using xp_cmdshell, so the SQL user will have to have rights to run it...|||Cool... I will try using it and see if I can get it to work.
Thanks|||It worked awesome...
Thanks
Wednesday, February 15, 2012
Execute Dos Commands in T-SQL Script
I know there is a way to execute DOS commands inside T-Sql script, but can't
seem to find it. For example: using the MOVE command inside a stored
procedure to move .txt files into another directory after they've been
imported.
Can someone point me in the right direction?
Thanks ...|||The thing you are looking for is called xp_cmdshell.
Have a look in the BOL for the details. You need to be sysadmin to run this command.
kind regards
Greg O
Need to document your databases. Use the first and still the best AGS SQL
Scribe
http://www.ag-software.com
|||Thanks, Greg ... I did run into that one, but wasn't getting it to work right (am running it on my own version of SQL Server, so I shouldn't be having an Admin problem).
Now that I'm sure that's what I need to pursue, I'll give it another try.
|||One basic question, which I assume is also true: are you a SysAdmin on the server?
Where are you trying to make calls to? If it is on a network share, then
your SQL Server service account needs permissions there. If you are running
under a local system account, then you will have to grant the machine
account access to the share.
Here is a basic statement that should return values for you:
exec xp_cmdshell 'dir c:'
>