snowflakedb / snowflake-connector-net
Snowflake Connector for .NET
License: Apache License 2.0
The connector tests always run against the server [account].snowflakecomputing.com. There is no way to configure them to use a different server, e.g. [account].eu-west-1.snowflakecomputing.com.
Run the tests with a parameters.json that uses a server not located in the US West AWS region.
Running the tests from the command line results in this error message.
Error Message:
System.AggregateException : One or more errors occurred. (A task was canceled.)
----> System.Threading.Tasks.TaskCanceledException : A task was canceled.
Driver version:
Snowflake.Data 1.0.4
Dotnet framework and version:
.net standard 2.0
Server version:
3.0.5
Client OS:
Windows 10
The following connection string creates an Invalid key value pair error when password contains an "=":
"account=myaccount;user=myuser;password=hhh=hhh;db=mydb;host=myhost;warehouse=mywarehouse"
Error: {"Connection string is invalid: Invalid kay value pair password=hhh=hhh"}
I verified that an account whose password does not contain an "=" connects just fine.
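A common way to avoid this class of bug is to split each semicolon-delimited pair on the first "=" only, so that "=" characters in the value survive. A minimal sketch of such a parser (a hypothetical helper, not the connector's actual code):

```csharp
using System;
using System.Collections.Generic;

static class ConnStringParser
{
    // Split each semicolon-delimited pair on the FIRST '=' only, so values
    // containing '=' (e.g. password=hhh=hhh) are kept intact.
    public static Dictionary<string, string> Parse(string connectionString)
    {
        var result = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        foreach (var pair in connectionString.Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries))
        {
            int idx = pair.IndexOf('=');
            if (idx < 1)
                throw new ArgumentException($"Invalid key value pair {pair}");
            result[pair.Substring(0, idx)] = pair.Substring(idx + 1);
        }
        return result;
    }
}
```

With this approach, Parse("password=hhh=hhh") yields the key "password" with the value "hhh=hhh".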
Actually, while I'm trying to pull millions of records from the table, it throws an exception at different points. Sometimes the code breaks after pulling 400,000 (4 lakh) records and sometimes at 800,000 (8 lakh).
Below is the log file for the above issue
app.log
ConnectionString should work with valid credentials, but doesn't: "Incorrect username or password was specified."
We were able to connect to Worksheets with the same credentials.
scheme=https;port=443;user=THE_USER;password=THE_PASSWORD;account=OUR_STUFF.east-us-2.azure
There is an example in the README.md file showing how to enable logging.
Snowflake.Data(1.0.4)
netcoreapp2.0
Windows Server 2012R2 in Azure
Hi,
When Snowflake is very busy, you risk cancelling an async command (the timeout kicks in) before it has actually run. In this case, the cancel POST returns JSON that looks like this:
{
"data": {
"internalError": false,
"errorCode": "000605",
"age": 0,
"sqlState": "01000",
"queryId": null
},
"message": "Identified SQL statement is not currently executing.",
"code": "000605",
"success": false
}
The connector then tries to deserialize this into NullDataResponse, which fails because the data field is expected to be a string:
"errorType": "JsonReaderException",
"errorMessage": "Unexpected character encountered while parsing value: {. Path 'data', line 2, position 12.",
"stackTrace": [
  "at Newtonsoft.Json.JsonTextReader.ReadStringValue(ReadType readType)",
  "at Newtonsoft.Json.JsonTextReader.ReadAsString()",
  "at Newtonsoft.Json.JsonReader.ReadForType(JsonContract contract, Boolean hasConverter)",
  "at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.PopulateObject(Object newObject, JsonReader reader, JsonObjectContract contract, JsonProperty member, String id)",
  "at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateObject(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)",
  "at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader reader, Type objectType, Boolean checkAdditionalContent)",
  "at Newtonsoft.Json.JsonSerializer.DeserializeInternal(JsonReader reader, Type objectType)",
  "at Newtonsoft.Json.JsonConvert.DeserializeObject(String value, Type type, JsonSerializerSettings settings)",
  "at Newtonsoft.Json.JsonConvert.DeserializeObject[T](String value, JsonSerializerSettings settings)",
  "at Snowflake.Data.Core.RestRequestImpl.PostAsync[T](SFRestRequest postRequest, CancellationToken cancellationToken)",
  "at Snowflake.Data.Core.RestRequestImpl.<>c__DisplayClass11_0'1.<<Post>b__0>d.MoveNext()"
]
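One way to fix the deserialization would be to model the cancel response with data as an object rather than a string. A sketch matching the JSON above, with hypothetical class names (the actual connector types may differ):

```csharp
using Newtonsoft.Json;

// Hypothetical shape for the cancel response shown above: 'data' is a
// nested object, so the target class must model it as one.
class CancelResponseData
{
    [JsonProperty("internalError")] public bool InternalError { get; set; }
    [JsonProperty("errorCode")] public string ErrorCode { get; set; }
    [JsonProperty("age")] public int Age { get; set; }
    [JsonProperty("sqlState")] public string SqlState { get; set; }
    [JsonProperty("queryId")] public string QueryId { get; set; }
}

class CancelResponse
{
    [JsonProperty("data")] public CancelResponseData Data { get; set; }
    [JsonProperty("message")] public string Message { get; set; }
    [JsonProperty("code")] public string Code { get; set; }
    [JsonProperty("success")] public bool Success { get; set; }
}
```

Deserializing the JSON above into CancelResponse with JsonConvert.DeserializeObject&lt;CancelResponse&gt; would then succeed where NullDataResponse fails.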
Currently active connections expire after 4 hours and are no longer usable.
I would expect that if a connection is active as described in #37 the connection will continue to work beyond 4 hours.
Driver version: Master, any framework.
Potentially transient errors in the HttpUtil are not retried. I would expect that transient errors for the http connections would be retried automatically by the client.
Driver version: Master, any framework
Server version: 2.59.0
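A retry policy for transient errors could be layered over HttpClient by the caller. A sketch with exponential backoff, treating 5xx, 408, and network-level failures as transient; the retry count and the definition of "transient" here are assumptions, not the connector's actual policy:

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

static class HttpRetry
{
    // Retry transient failures with exponential backoff. A fresh
    // HttpRequestMessage is built per attempt, because a message
    // cannot be re-sent once it has been dispatched.
    public static async Task<HttpResponseMessage> SendWithRetryAsync(
        HttpClient client,
        Func<HttpRequestMessage> requestFactory,
        CancellationToken cancellationToken,
        int maxRetries = 3)
    {
        for (int attempt = 0; ; attempt++)
        {
            try
            {
                HttpResponseMessage response =
                    await client.SendAsync(requestFactory(), cancellationToken);
                bool transient = (int)response.StatusCode >= 500 ||
                                 response.StatusCode == HttpStatusCode.RequestTimeout;
                if (!transient || attempt >= maxRetries)
                    return response;
                response.Dispose();
            }
            catch (HttpRequestException) when (attempt < maxRetries)
            {
                // Network-level failure: fall through to the backoff below.
            }
            await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt)), cancellationToken);
        }
    }
}
```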
Right now we hard-coded the headers:
public HttpResponseMessage get(S3DownloadRequest getRequest)
{
    HttpRequestMessage message = new HttpRequestMessage(HttpMethod.Get, getRequest.uri);
    message.Headers.Add(SSE_C_ALGORITHM, SSE_C_AES);
    message.Headers.Add(SSE_C_KEY, getRequest.qrmk);
    message.Properties["TIMEOUT_PER_HTTP_REQUEST"] = getRequest.httpRequestTimeout;

    CancellationTokenSource cancellationTokenSource =
        new CancellationTokenSource(getRequest.timeout);

    return sendRequest(message, cancellationTokenSource.Token);
}
Instead, we should use chunk headers returned by server.
Hi, I am having trouble installing this NuGet package (v1.0.0) into the project. I updated all the dependencies prior to the installation, and it will only work with v0.3.0. I found an article from the NuGet page; perhaps it could help solve the problem.
Nuget issue: NuGet/Home#4639
Logs:
Attempting to gather dependency information for package 'Snowflake.Data.1.0.0' with respect to project 'ProjectTest', targeting '.NETFramework,Version=v4.6'
Gathering dependency information took 14.09 sec
Attempting to resolve dependencies for package 'Snowflake.Data.1.0.0' with DependencyBehavior 'Lowest'
One or more unresolved package dependency constraints detected in the existing packages.config file. All dependency constraints must be resolved to add or update packages. If these packages are being updated this message may be ignored, if not the following error(s) may be blocking the current package operation: 'Snowflake.Data 0.3.0'
Resolving dependency information took 0 ms
Resolving actions to install package 'Snowflake.Data.1.0.0'
Resolved actions to install package 'Snowflake.Data.1.0.0'
Retrieving package 'Snowflake.Data 1.0.0' from 'nuget.org'.
Removed package 'Snowflake.Data.0.3.0' from 'packages.config'
Successfully uninstalled 'Snowflake.Data.0.3.0' from Insightly.Core.Service
Adding package 'Snowflake.Data.1.0.0' to folder 'C:\Projects\Core.2\packages'
Added package 'Snowflake.Data.1.0.0' to folder 'C:\Projects\Core.2\packages'
Install failed. Rolling back...
Package 'Snowflake.Data.1.0.0 : log4net [2.0.8, ), Newtonsoft.Json [11.0.2, )' does not exist in project 'ProjectTest'
Package 'Snowflake.Data.0.3.0' already exists in folder 'C:\Projects\Core.2\packages'
Added package 'Snowflake.Data.0.3.0' to 'packages.config'
Removing package 'Snowflake.Data.1.0.0 : log4net [2.0.8, ), Newtonsoft.Json [11.0.2, )' from folder 'C:\Projects\Core.2\packages'
Removed package 'Snowflake.Data.1.0.0 : log4net [2.0.8, ), Newtonsoft.Json [11.0.2, )' from folder 'C:\Projects\Core.2\packages'
Executing nuget actions took 2.32 min
Failed to add reference. The package 'Snowflake.Data' tried to add a framework reference to 'System.Runtime' which was not found in the GAC. This is possibly a bug in the package. Please contact the package owners for assistance.
========== Finished ==========
Time Elapsed: 00:02:36.9216035
I saw a similar issue among the closed issues, but it was closed for lack of follow-up and there's no resolution, so I figured I'd open another one. If you feel reopening the closed issue is more appropriate, please do so.
I am trying to connect to Snowflake from a web app (I tried both ASP.NET 4.6 and Core 2.1). When I try to open the connection, I get:
Remote certificate has errors:
Certificate name mismatch.
Remote certificate was verified as invalid by the user.
Connecting from both IE and Chrome works fine.
I also tried the Snowflake.Data.Tests project; "SFConnectionIT" passed (connecting with the same connection string as my web app).
I used correct values in <...> in code:
private const string connectionStringFmt = "scheme=https;host={0}.snowflakecomputing.com;" +
    "user={1};password={2};account={3};role={4};db={5};schema={6};warehouse={7}";

private static readonly string connectionString = String.Format(connectionStringFmt,
    "east-us-2.azure",
    <user>,
    <pw>,
    <account>,
    "SYSADMIN",
    "SNOWFLAKE_SAMPLE_DATA",
    "TPCH_SF001",
    <wh>);
using (IDbConnection conn = new SnowflakeDbConnection())
{
    try
    {
        conn.ConnectionString = connectionString;
        conn.Open();
    }
    catch (Exception ex)
    {
    }
}
System.Net.Trace.Log
System.Net Information: 0 : [16372] Remote certificate: [Version]
V3
[Subject]
CN=*.snowflakecomputing.com, OU=Secure Link SSL Wildcard, OU=snowflake computing, O=Snowflake Computing, L=San Mateo, S=CA, PostalCode=94401, C=US
Simple Name: *.snowflakecomputing.com
DNS Name: snowflakecomputing.com
[Issuer]
CN=Network Solutions OV Server CA 2, O=Network Solutions L.L.C., L=Herndon, S=VA, C=US
Simple Name: Network Solutions OV Server CA 2
DNS Name: Network Solutions OV Server CA 2
[Serial Number]
290714BCB7E17B569CE72A1C0783C4D1
[Not Before]
2016-04-04 8:00:00 PM
[Not After]
2019-04-13 7:59:59 PM
[Thumbprint]
B64B85AA164DC0FE956465D42475EA9BED2A4C52
[Signature Algorithm]
sha256RSA(1.2.840.113549.1.1.11)
[Public Key]
Algorithm: RSA
Length: 4096
Key Blob: 30 82 02 0a 02 82 02 01 00 e9 d5 8b fa 42 fa 10 5c bd 8f 29 fb 65 bf 76 c8 86 d4 ce d6 39 5a f7 de bf b7 5d b6 e1 d6 1b 96 4e e9 f0 aa 2d 18 96 a5 fa 8f ef 85 d9 b3 f8 68 e8 b3 80 2e a0 b0 f9 1d 60 57 1f 65 b4 7c 5e a4 1b 18 7e 00 64 d0 22 77 37 75....
ProcessId=11988
DateTime=2018-10-22T15:27:03.8525458Z
System.Net Information: 0 : [16372] SecureChannel#611796 - Remote certificate has errors:
ProcessId=11988
DateTime=2018-10-22T15:27:03.8575595Z
System.Net Information: 0 : [16372] SecureChannel#611796 - Certificate name mismatch.
ProcessId=11988
DateTime=2018-10-22T15:27:03.8655797Z
System.Net Information: 0 : [16372] SecureChannel#611796 - Remote certificate was verified as invalid by the user.
ProcessId=11988
DateTime=2018-10-22T15:27:03.8675862Z
System.Net.Sockets Verbose: 0 : [16372] Entering Socket#41348378::Dispose()
ProcessId=11988
DateTime=2018-10-22T15:27:03.9157128Z
System.Net Error: 0 : [16372] Exception in HttpWebRequest#22597652:: - The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel..
ProcessId=11988
DateTime=2018-10-22T15:27:03.9167167Z
System.Net Verbose: 0 : [16372] Entering HttpWebRequest#22597652::EndGetResponse()
ProcessId=11988
DateTime=2018-10-22T15:27:03.9187213Z
System.Net Error: 0 : [16372] Exception in HttpWebRequest#22597652::EndGetResponse - The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel..
ProcessId=11988
DateTime=2018-10-22T15:27:03.9197256Z
System.Net Verbose: 0 : [11036] Entering HttpWebRequest#22597652::Abort()
ProcessId=11988
DateTime=2018-10-22T15:27:19.2435434Z
System.Net Error: 0 : [11036] Exception in HttpWebRequest#22597652:: - The request was aborted: The request was canceled..
ProcessId=11988
DateTime=2018-10-22T15:27:19.2465526Z
System.Net Verbose: 0 : [11036] Exiting HttpWebRequest#22597652::Abort()
ProcessId=11988
DateTime=2018-10-22T15:27:19.2485588Z
Driver version: 1.0.4
Dotnet framework and version: I tried both .net framework 4.6.1 and .net core 2.1
Server version: 3.1.0
Region ID: east-us-2.azure
Client OS: Windows 10
We have built a simple Entity Framework provider for Snowflake. While doing so we discovered that Entity Framework infrastructure code reads and writes properties found on the SnowflakeDbParameter class (specifically IsNullable, Size, SourceColumn, and SourceColumnNullMapping), which are implemented to throw NotImplementedException.
We've addressed this in a fork by causing these properties to return default values and to reject non-default values passed to their setters, and have submitted PR #55 as a proposed fix.
We believe this is a good compromise between throwing NotImplementedException for all values and allowing arbitrary values that the rest of the library ignores.
Driver version: Snowflake.Data/1.0.2
Dotnet framework and version: netstandard2.0
Server version: 2.55.1
Client OS: Windows 10
--From Prospect
Using MySql.Data.MySqlClient
System.Data.Common.DbConnection = MySql.Data.MySqlClient.MySqlConnection (Present in the Snowflake connector)
System.Data.Common.DbDataAdapter = MySql.Data.MySqlClient.MySqlDataAdapter (Not present in the Snowflake connector)
Below are examples of how we use this object.
// Example fill call
using (System.Data.Common.DbConnection conn = new System.Data.Common.DbConnection([string: Connection String]))
{
    conn.Open();
    using (System.Data.Common.DbDataAdapter adapter = new System.Data.Common.DbDataAdapter([string: Command Text], conn))
    {
        adapter.Fill([System.Data.DataSet: Data Set], [string: Table Name]);
    }
    conn.Close();
}

// Example using async fill call
using (System.Data.Common.DbConnection conn = new System.Data.Common.DbConnection([string: Connection String]))
{
    conn.Open();
    using (System.Data.Common.DbDataAdapter adapter = new System.Data.Common.DbDataAdapter([string: Command Text], conn))
    {
        System.Threading.Tasks.Task task = adapter.FillAsync([System.Data.DataSet: Data Set], [string: Table Name]);
        try
        {
            if (!task.Wait([int: Command Wait Time (milliseconds)]))
            {
                adapter.SelectCommand.Connection.CancelQuery([int: Cancel Wait Time (seconds)]);
            }
        }
        catch (System.Data.Common.DbException exc)
        {
            throw exc.InnerException;
        }
    }
    conn.Close();
}
When spawning 4 threads to Snowflake using a simple C# code in a small warehouse only 2 are running and none are queued. The other 2 that are waiting run when the first 2 are done. However if I spawn 4 programs, each running a simple query then everything appears correct with 2 running queries and 2 queued queries.
C# project attached - you will need to connect to your own Snowflake account
[MultiThread Snowflake Tester.zip](https://github.com/snowflakedb/snowflake-connector-net/files/2349282/MultiThread.Snowflake.Tester.zip)
NA
There is an example in the README.md file showing how to enable logging.
Driver version:
Snowflake.Data 1.0.3
Dotnet framework and version:
4.6.1
Server version:
not specified (obtainable by running SELECT CURRENT_VERSION();)
Client OS:
not specified
My SQL has multiple SQL statements. While using the ExecuteReader method I am getting the error {"Multiple SQL statements in a single API call are not supported; use one API call per statement instead."}
Attempting to use the ACCOUNT= connection string parameter WITHOUT a HOST= parameter will fail due to an invalid string formatter being used (%s instead of {0}).
Even after fixing the string formatter issue, the standard URL built from ACCOUNT= does not work and returns a 403 when we attempt to log in. Explicitly setting HOST= to zp51084.us-east-1.snowflakecomputing.com allows us to log in.
The ROLE=, DB=, SCHEMA=, and WAREHOUSE= connection string parameters do not seem to work; even with a WAREHOUSE= parameter set, we still need to execute a USE WAREHOUSE statement when the connection opens, otherwise no warehouse is selected.
Snowflake.Data 1.0.4
Here is the connection string -
conn.ConnectionString = "scheme=https;host=prod.us-east-1..snowflakecomputing.com;user=;password=;account=prod.us-east-1.*;";
conn.Open();
IDbCommand cmd = conn.CreateCommand();
cmd.CommandText = "select * from table";
IDataReader reader = cmd.ExecuteReader();
I am getting Incorrect username or password was specified error.
Every time a null value is received by the connector, it breaks with null value exceptions.
I have submitted a small change in #10 which fixes this for my purposes, and I think it's the best place to handle it, but I am not knowledgeable enough about the connector to assert this position.
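Until the connector handles NULLs itself, a caller-side guard with IsDBNull avoids the exceptions. A minimal sketch, assuming cmd is an IDbCommand on an open Snowflake connection:

```csharp
// Guard every typed getter with IsDBNull so NULL columns do not throw.
// 'cmd' is assumed to be an IDbCommand on an open connection.
using (IDataReader reader = cmd.ExecuteReader())
{
    while (reader.Read())
    {
        string value = reader.IsDBNull(0) ? null : reader.GetString(0);
        Console.WriteLine(value ?? "<NULL>");
    }
}
```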
If an insert or update statement causes more than int.MaxValue rows to be modified an OverflowException is thrown.
I would expect the Snowflake connector to be able to handle this case without errors.
Driver version: Master, any framework
Server version: 2.59.0
It won't change the role from the default to ACCOUNTADMIN even though it's defined in the connection string; the database and possibly the schema parameters aren't recognized either.
using (IDbConnection conn = new SnowflakeDbConnection())
{
    conn.ConnectionString = "account=demo16;host=XXXXX.snowflakecomputing.com;user=eddy;password=XXXXXXX;db=edy;schema=public;role=accountadmin;warehouse=etl";
    conn.Open();

    IDbCommand cmd = conn.CreateCommand();
    cmd.CommandText = "select * from edy.public.eddy1;";
    IDataReader reader = cmd.ExecuteReader();
    while (reader.Read())
    {
        Console.WriteLine(reader.GetString(0));
    }
    conn.Close();
}
I had to add in these commands to properly switch the context:
IDbCommand cmd = conn.CreateCommand();
cmd.CommandText = "use role accountadmin";
cmd.ExecuteNonQuery();

cmd.CommandText = "use database edy";
cmd.ExecuteNonQuery();

cmd.CommandText = "use schema public";
cmd.ExecuteNonQuery();
Calling ExecuteNonQuery() on a COPY statement can generate an exception if the COPY statement fails and Snowflake returns a single column instead of the 3 that the code in ResultSetUtil.cs expects.
Hi, I am experiencing the odd behavior that SnowflakeDbConnection works in a unit test and a console app, but fails when connecting via an ASP.NET app with the same connection string. The error message is "Incorrect username or password was specified."
Any clue?
Here is the connection string:
scheme=https;host=dz16756.snowflakecomputing.com;port=443;user=blah;password=blah;account=dz16756;db=analytics;warehouse=analytics
Array data type is not supported when retrieving value
create table test1 (Id int not null, xyz array null);
insert into test1 (Id) values (1);

Then in C# use:

string sqlQuery = "select xyz from test1;";

if (reader.GetString(0) == null)
    nullCount++;

Error: Requested value 'Array' was not found
Before forgetting this (like I did for golang), I'm filing an issue here as a reminder.
SnowflakeDbTransaction does not properly roll back a transaction upon dispose. I'm not sure where the official guidelines for extending DbTransaction are, but the expected behavior is that the protected virtual Dispose method is overridden to roll back the transaction on dispose if it was not committed.
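A sketch of the expected Dispose override; the _committed field is hypothetical, and the real fix would live inside SnowflakeDbTransaction:

```csharp
// Inside a DbTransaction subclass; '_committed' is a hypothetical flag
// set by Commit(). Dispose rolls back anything left uncommitted.
private bool _committed;

protected override void Dispose(bool disposing)
{
    if (disposing && !_committed &&
        Connection != null && Connection.State == System.Data.ConnectionState.Open)
    {
        Rollback();
    }
    base.Dispose(disposing);
}
```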
Formatting the string with "%s" in SFSessionProperties.parseConnectionString(string, SecureString) builds an invalid URI:
string hostName = String.Format("%s.snowflakecomputing.com", properties[SFSessionProperties.ACCOUNT]);
it should be
string hostName = String.Format("{0}.snowflakecomputing.com", properties[SFSessionProperties.ACCOUNT]);
While trying to get the schema of the table via GetSchemaTable(), it shows the method isn't there. Can you please add it?
In C#, while debugging, conn.Open() takes a lot of time and doesn't respond back; it just keeps loading.
It would be amazing if you also ported/integrated this .NET connector to a .NET Core 2 version.
Slow retrieval of data from S3 server using the .Net driver.
We are experiencing slow performance when querying data using the .NET driver. The Snowflake history shows the query execution takes less than a second, but retrieving the actual results through SnowflakeDbDataReader takes 10 seconds or more. This becomes a problem when we make asynchronous calls, because the read time seems to increase linearly.
For example, we make 30 async calls, each returning 50k rows, and wait for all the results to come in. The end-to-end time in this case is around 70 seconds, even though the actual queries in the database take milliseconds. This behaviour is also present when the query results are cached; the end-to-end time is still more than 60 seconds.
Furthermore there are two exceptions thrown when doing async calls.
Exception Call Stack attached.
Driver version: 1.0.3
Dotnet framework and version: .net standard 2.0
Server version: 2.53.1
I created the JSON file, copied and pasted the values, and changed them. I am getting this error:
Failed testArrayBind
Error Message:
OneTimeSetUp: Newtonsoft.Json.JsonSerializationException : Error converting value "" to type 'Snowflake.Data.Tests.TestConfig'. Path 'SNOWFLAKE_TEST_USER', line 2, position 35.
----> System.ArgumentException : Could not cast or convert from System.String to Snowflake.Data.Tests.TestConfig.
The docs seem to be wrong: HOST is required and does not get a default if it's left out of the connection string.
Using a NULL-valued parameter throws NotSupportedException! I was trying to execute the query SELECT * FROM Table1 WHERE Column1 = COALESCE(:1, Column1);. By design this query is expected to return a few rows (all matched) when I supply a value for :1, and the entire list when I supply a NULL value for parameter :1; but it fails with NotSupportedException on execution.
#61 seems to be similar to this.
Currently, sessions cannot be renewed because the response to the renew request is not handled correctly. The response is currently deserialized into a NullDataResponse, which fails and throws an exception because the data field is an object, not a string. I created a new response class and applied it to the response. I also made the session object reflect the change in token generated by Snowflake.
First change your time zone to e.g. Europe/Helsinki, EEST, +3:00.
When running the mentioned test the assert
Assert.AreEqual(0, DateTime.Compare(today, reader.GetDateTime(0)));
fails; the Compare method returns 1 when the test is executed in a time zone that is ahead of UTC. The test should be runnable in any time zone.
The reason seems to be that the test tries to insert DateTime.Today into a date field. The time is compared to Unix Epoch UTC time, and e.g. in my time zone (EEST, +3) the calculated time difference is three hours short of a full 24 hours. In the western hemisphere this is not a problem, because the calculation yields extra hours that are floored away. Yesterday's date is written to the table instead of the current date. My suggestion would be to use DateTime.UtcNow.Date instead of DateTime.Today in the test.
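The suggested change to the test would look like this sketch:

```csharp
// Bind a timezone-independent date so the test passes in any time zone.
DateTime today = DateTime.UtcNow.Date;   // instead of DateTime.Today
// ... insert 'today' into the date column, then:
Assert.AreEqual(0, DateTime.Compare(today, reader.GetDateTime(0)));
```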
Starting test execution, please wait...
Failed testGetDateTime
Error Message:
Expected: 0
But was: 1
Stack Trace:
at Snowflake.Data.Tests.SFDbDataReaderIT.testGetDateTime() in C:\work\snowflake-connector-net\Snowflake.Data.Tests\SFDbDataReaderIT.cs:line 181
Driver version:
1.0.4
Dotnet framework and version:
.net standard 2.0
Server version:
3.0.5
Client OS:
Windows 10
It seems #21 was closed prematurely, because the warehouse property in the connection string is not used to select the warehouse when establishing the session.
This should be automatic if the property is provided in the connection string.
Error: No active warehouse selected in the current session.
Select an active warehouse with the 'use warehouse' command.
SqlState: 57P03, VendorCode: 606, QueryId: xxx
Provided already in the test.
Failed testGetFloat
Error Message:
Snowflake.Data.Client.SnowflakeDbException : SQL compilation error:
Invalid bind value (1,23) for type (REAL).
Stack Trace:
at Snowflake.Data.Core.SFStatement.BuildResultSet(QueryExecResponse response, CancellationToken cancellationToken) in C:\work\snowflake-connector-net\Snowflake.Data\Core\SFStatement.cs:line 121
at Snowflake.Data.Core.SFStatement.Execute(Int32 timeout, String sql, Dictionary`2 bindings, Boolean describeOnly) in C:\work\snowflake-connector-net\Snowflake.Data\Core\SFStatement.cs:line 233
at Snowflake.Data.Client.SnowflakeDbCommand.ExecuteInternal(Boolean describeOnly) in C:\work\snowflake-connector-net\Snowflake.Data\Client\SnowflakeDbCommand.cs:line 256
at Snowflake.Data.Client.SnowflakeDbCommand.ExecuteNonQuery() in C:\work\snowflake-connector-net\Snowflake.Data\Client\SnowflakeDbCommand.cs:line 132
at Snowflake.Data.Tests.SFDbDataReaderIT.testGetFloat() in C:\work\snowflake-connector-net\Snowflake.Data.Tests\SFDbDataReaderIT.cs:line 118
1.0.4
Dotnet framework and version:
.net standard 2.0
Server version:
3.0.5
Client OS:
Windows 10
To add support for Snowflake to the Fluent Migrator project (https://github.com/fluentmigrator/fluentmigrator), the connector must implement System.Data.Common.DbProviderFactory, whose purpose is:
Represents a set of methods for creating instances of a provider's implementation of the data source classes.
In my experience a very basic implementation is sufficient at first to allow using Fluent Migrator for Snowflake database migrations.
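A minimal factory could look like the sketch below; the class name is hypothetical, and it assumes parameterless constructors exist on the Snowflake ADO.NET classes:

```csharp
using System.Data.Common;
using Snowflake.Data.Client;

// Minimal sketch of a provider factory; only the members a migration
// tool typically needs are implemented.
public sealed class SnowflakeDbFactory : DbProviderFactory
{
    public static readonly SnowflakeDbFactory Instance = new SnowflakeDbFactory();
    private SnowflakeDbFactory() { }

    public override DbConnection CreateConnection() => new SnowflakeDbConnection();
    public override DbCommand CreateCommand() => new SnowflakeDbCommand();
    public override DbParameter CreateParameter() => new SnowflakeDbParameter();
}
```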
- A using statement on a connection should auto-dispose the connection, but doesn't -> (should override Dispose and call Close)
- Should a connection be allowed to be opened in a command?
//THIS WORKS AND CLOSES CONNECTION
var con = new Snowflake.Data.Client.SnowflakeDbConnection();
con.ConnectionString = "CONNECTION_STRING_HERE";
try
{
    con.Open();
    using (var cmd = new Snowflake.Data.Client.SnowflakeDbCommand(con))
    {
        //cmd.CommandType = CommandType.Text;
        cmd.CommandText = query;
        using (var da = new Snowflake.Data.Client.SnowflakeDbDataAdapter(cmd))
        {
            da.Fill(dt);
        }
    }
}
finally
{
    con.Close();
}

//THIS DOESN'T WORK AND FAILS TO CLOSE CONNECTION
using (var con = new Snowflake.Data.Client.SnowflakeDbConnection())
{
    con.ConnectionString = "CONNECTION_STRING_HERE";
    using (var cmd = new Snowflake.Data.Client.SnowflakeDbCommand(con))
    {
        cmd.CommandType = CommandType.Text;
        cmd.CommandText = query;
        con.Open();
        using (var da = new Snowflake.Data.Client.SnowflakeDbDataAdapter(cmd))
        {
            da.Fill(dt);
        }
    }
    //con.Close(); //This line doesn't matter since the exception happens above and it gets bypassed
}
Snowflake.Data(1.0.4)
netcoreapp2.0
Calling Close() on the connection does not actually update the connection state.
Expected: State to be ConnectionState.Closed after calling Close(). Currently ConnectionState.Open.
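A sketch of the expected behavior; the member names are hypothetical, and the real fix would belong inside SnowflakeDbConnection:

```csharp
// Inside a DbConnection subclass; '_state' is a hypothetical backing field.
private System.Data.ConnectionState _state = System.Data.ConnectionState.Closed;

public override System.Data.ConnectionState State => _state;

public override void Close()
{
    // ... terminate the Snowflake session here ...
    _state = System.Data.ConnectionState.Closed;   // report the new state
}
```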
We have hit some cases where, during long 4-hour-plus processing runs, the tokens no longer renew and we receive the following exception.
Snowflake.Data.Client.SnowflakeDbException: Authentication token has expired. The user must authenticate again.
at Snowflake.Data.Core.SFSession.renewSession() in C:\Projects\obelisk\NuGets\thirdpartylibs\snowflake-connector-net\snowflake-connector-net\Snowflake.Data\Core\SFSession.cs:line 229
at Snowflake.Data.Core.SFStatement.<ExecuteAsync>d__25.MoveNext() in C:\Projects\obelisk\NuGets\thirdpartylibs\snowflake-connector-net\snowflake-connector-net\Snowflake.Data\Core\SFStatement.cs:line 171
---> (Inner Exception #0) Error: Authentication token has expired. The user must authenticate again. SqlState: , VendorCode: 390114, QueryId: <---
How can this be resolved? I assume that if we login again after the failure in renewSession we would be unable to access any temp tables or other session level stuff that existed on the previous session.
If the query does not return a result, ExecuteScalar and ExecuteScalarAsync will throw an IndexOutOfRangeException (I think this gets swallowed and returned as a generic error, IIRC).
The expected behavior is that ExecuteScalar would return DBNull in this case.
var cmd = new SnowflakeDbCommand(_conn) {CommandText = "SELECT * FROM <TABLE> WHERE 1 !=1"};
var result = cmd.ExecuteScalar();
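Until that is fixed, a caller-side wrapper can reproduce the expected semantics. A sketch:

```csharp
using System;
using System.Data;

// Returns DBNull.Value instead of throwing when the query yields no rows.
static object ExecuteScalarSafe(IDbCommand cmd)
{
    using (IDataReader reader = cmd.ExecuteReader())
    {
        return reader.Read() ? reader.GetValue(0) : DBNull.Value;
    }
}
```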
Driver version: Master, any framework.
Server version: 2.59.0
Hi! I am trying to get the auto-generated IDENTITY key, which is a basic piece of typical INSERT scenarios.

CREATE TABLE IF NOT EXISTS MyTable(
    Id INT IDENTITY(1, 1) NOT NULL,
    DoesNotMatter INT NULL);

INSERT INTO MyTable (999);

SELECT MAX(Id) FROM MyTable AT (STATEMENT => LAST_QUERY_ID());
Error:
'Time travel data is not available for table MyTable.'
The above works through the Snowflake query HTML page.
I suspect that the driver injects an extra statement/query between the INSERT and the SELECT, or maybe affects the session otherwise? The driver also does not support running more than two commands/queries per command execution. As a result there is nothing usable that would give @@IDENTITY functionality, which is crucial for INSERT. Please let me know if it's a bug or if I am doing something wrong.
And of course the documentation around the error above is very lacking; a Google search gives just one exact match. It would be great to list more reasons than one, as time travel is one of the more advertised features of the product. Thank you, and looking forward to some direction/solution.
The Snowflake logging produces too much noise in our logs. Can we add an appSetting or static property to turn logging on and off via configuration or code? Thank you.
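As a workaround, since the connector logs through log4net, the application can raise the repository threshold itself; the Warn level below is an assumption about the desired verbosity:

```csharp
using System.Reflection;
using log4net;
using log4net.Core;

// Raise the log4net repository threshold so the connector's DEBUG/INFO
// chatter is suppressed; Level.Off would silence logging entirely.
LogManager.GetRepository(Assembly.GetEntryAssembly()).Threshold = Level.Warn;
```

The same effect can be had declaratively by setting a threshold in the application's log4net configuration.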
When I tried inserting into a table it started throwing an object reference error. Please clear up this issue or guide me on how to fix it.
[2018-09-04 23:06:49,379] [1] [INFO ] [SnowFlake_Core_Test.Program] Application - Main is invoked
[2018-09-04 23:07:10,074] [8] [DEBUG] [Snowflake.Data.Client.SnowflakeDbCommand] Constucting SnowflakeDbCommand class
[2018-09-04 23:07:10,082] [8] [DEBUG] [Snowflake.Data.Client.SnowflakeDbConnection] Open Connection.
[2018-09-04 23:07:10,133] [8] [INFO ] [Snowflake.Data.Core.SFSessionProperties] Start parsing connection string.
[2018-09-04 23:07:10,150] [8] [INFO ] [Snowflake.Data.Core.SFSessionProperties] Connection property: SCHEME, value: https
[2018-09-04 23:07:10,150] [8] [INFO ] [Snowflake.Data.Core.SFSessionProperties] Connection property: HOST, value: juniper.snowflakecomputing.com
[2018-09-04 23:07:10,154] [8] [INFO ] [Snowflake.Data.Core.SFSessionProperties] Connection property: PORT, value: 443
[2018-09-04 23:07:10,155] [8] [INFO ] [Snowflake.Data.Core.SFSessionProperties] Connection property: USER, value: XXXXXXX
[2018-09-04 23:07:10,155] [8] [INFO ] [Snowflake.Data.Core.SFSessionProperties] Connection property: PASSWORD, value: XXXXXXXX
[2018-09-04 23:07:10,155] [8] [INFO ] [Snowflake.Data.Core.SFSessionProperties] Connection property: ACCOUNT, value: juniper
[2018-09-04 23:07:10,155] [8] [INFO ] [Snowflake.Data.Core.SFSessionProperties] Connection property: ROLE, value:
[2018-09-04 23:07:10,155] [8] [INFO ] [Snowflake.Data.Core.SFSessionProperties] Connection property: DB, value: XXXXXXXX
[2018-09-04 23:07:10,155] [8] [INFO ] [Snowflake.Data.Core.SFSessionProperties] Connection property: SCHEMA, value: DEV
[2018-09-04 23:07:10,155] [8] [INFO ] [Snowflake.Data.Core.SFSessionProperties] Connection property: WAREHOUSE, value: XXXXXX
[2018-09-04 23:07:10,161] [8] [DEBUG] [Snowflake.Data.Core.SFSessionProperties] Sesssion property CONNECTION_TIMEOUT set to default value: 0
[2018-09-04 23:07:10,163] [8] [DEBUG] [Snowflake.Data.Core.SFSession] Open Session
[2018-09-04 23:07:12,582] [8] [DEBUG] [Snowflake.Data.Core.SFSession] Login Request Data: SFRestRequest {url: https://juniper.snowflakecomputing.com/session/v1/login-request?warehouse=LEGAL_WH&databaseName=EDP_COMPLIANCE&schemaName=DEV&roleName=&requestId=9ea9f7ae-149f-487e-9d05-f321eac5c653, request body: AuthRequest {data: AuthRequestData {ClientAppVersion: 1.0.3.0 AccountName: juniper, loginName: ETL_COMPLIANCE, ClientEnv: { APPLICATION: dotnet, OS_VERSION: Microsoft Windows NT 6.2.9200.0, NET_RUNTIME: .NET Core 4.6.26328.01 } } } }
[2018-09-04 23:07:12,652] [10] [DEBUG] [Snowflake.Data.Core.RestRequestImpl] Http method: POST, http request message: Method: POST, RequestUri: 'https://juniper.snowflakecomputing.com/session/v1/login-request?warehouse=XXXXX&databaseName=XXXXX&schemaName=XXXX&roleName=&requestId=9ea9f7ae-149f-487e-9d05-f321eac5c653', Version: 1.1, Content: System.Net.Http.StringContent, Headers:
{
Authorization: Basic
Accept: application/snowflake
Content-Type: application/json; charset=utf-8
}
[2018-09-04 23:07:15,126] [9] [DEBUG] [Snowflake.Data.Core.HttpUtil+RetryHandler] Success Response: StatusCode: 200, ReasonPhrase: 'OK', Version: 1.1, Content: System.Net.Http.NoWriteNoSeekStreamContent, Headers:
{
Connection: keep-alive
Date: Tue, 04 Sep 2018 17:37:14 GMT
Server: nginx
Strict-Transport-Security: max-age=31536000
X-Frame-Options: deny
Content-Length: 2860
Content-Type: application/json
}
[2018-09-04 23:07:15,134] [9] [DEBUG] [Snowflake.Data.Core.RestRequestImpl] Post response: {
"data" : {
"masterToken" : "ETMsDgAAAWWlqOfHABRBRVMvQ0JDL1BLQ1M1UGFkZGluZwCAABAAEHq3hnxdNqlcxVxwd18q5Z0AAACgAdHGiIH0XkVNV6RMuQq70zPf/NMNwI1+OZOablohoyUJanYe9JXPJkFyYCJu6g/EaegX7/RwOoTOP0i8FUskNDQVxFf2Eu5pQna0L0RmR+6vYC/E2ixI4lfJbCv+qf3dRFwHVaoKw9fv46mKaHqJei0npUEyX5b3izGaoWPXX68XKVlqufRYsXraVEvzVqGPmoFdFkHH2K8s0Wu/Qbh/4QAUqd9nUlClZV/mHLrVHn+GkQSMJNg=",
"token" : "ETMsDgAAAWWlqOfXABRBRVMvQ0JDL1BLQ1M1UGFkZGluZwCAABAAEBPuLxY+QfTgp5OkdAr91poAAACATtU1aHXGWdTpSe9UMeFn5KgrMr/y7ozTqZdRbSzwE7M9zVHSzli8Vp8rvqIPzuoCVvsreo0K/jzkrlMSsiCgdl3JYK+fXn5/GQQYSs6WWLQTVJbzdxOJ22n9c2m1aZ6konPk7yblOlAqyhI9Sv0UBEMuyKXm/O7wXehC0Ih6g+gAFN8Kg6uaqdAi5OkrTLccmXNzxWWL",
"validityInSeconds" : 3600,
"masterValidityInSeconds" : 14400,
"displayUserName" : "Compliance ETL User",
"serverVersion" : "2.56.0",
"firstLogin" : false,
"remMeToken" : null,
"remMeValidityInSeconds" : 0,
"healthCheckInterval" : 45,
"newClientForUpgrade" : null,
"sessionId" : 21427593734134,
"parameters" : [ {
"name" : "TIMEZONE",
"value" : "America/Los_Angeles"
}, {
"name" : "TIMESTAMP_OUTPUT_FORMAT",
"value" : "YYYY-MM-DD HH24:MI:SS.FF3 TZHTZM"
}, {
"name" : "TIMESTAMP_NTZ_OUTPUT_FORMAT",
"value" : "YYYY-MM-DD HH24:MI:SS.FF3"
}, {
"name" : "TIMESTAMP_LTZ_OUTPUT_FORMAT",
"value" : ""
}, {
"name" : "TIMESTAMP_TZ_OUTPUT_FORMAT",
"value" : ""
}, {
"name" : "DATE_OUTPUT_FORMAT",
"value" : "YYYY-MM-DD"
}, {
"name" : "TIME_OUTPUT_FORMAT",
"value" : "HH24:MI:SS"
}, {
"name" : "CLIENT_DISABLE_INCIDENTS",
"value" : false
}, {
"name" : "CLIENT_HONOR_CLIENT_TZ_FOR_TIMESTAMP_NTZ",
"value" : true
}, {
"name" : "CLIENT_RESULT_PREFETCH_SLOTS",
"value" : 2
}, {
"name" : "CLIENT_RESULT_PREFETCH_THREADS",
"value" : 1
}, {
"name" : "CLIENT_USE_V1_QUERY_API",
"value" : true
}, {
"name" : "CLIENT_SESSION_KEEP_ALIVE",
"value" : false
}, {
"name" : "CLIENT_TELEMETRY_ENABLED",
"value" : true
}, {
"name" : "BINARY_OUTPUT_FORMAT",
"value" : "HEX"
}, {
"name" : "AUTOCOMMIT",
"value" : true
}, {
"name" : "CLIENT_PREFETCH_THREADS",
"value" : 4
}, {
"name" : "CLIENT_MEMORY_LIMIT",
"value" : 1536
}, {
"name" : "CLIENT_METADATA_REQUEST_USE_CONNECTION_CTX",
"value" : false
}, {
"name" : "CLIENT_TIMESTAMP_TYPE_MAPPING",
"value" : "TIMESTAMP_LTZ"
}, {
"name" : "CLIENT_SESSION_CLONE",
"value" : false
}, {
"name" : "CLIENT_STAGE_ARRAY_BINDING_THRESHOLD",
"value" : 0
} ],
"sessionInfo" : {
"databaseName" : "EDP_COMPLIANCE",
"schemaName" : "DEV",
"warehouseName" : "XXXXX",
"roleName" : "XXXXX"
},
"idToken" : null,
"idTokenValidityInSeconds" : 0,
"responseData" : null
},
"message" : null,
"code" : null,
"success" : true
}
[2018-09-04 23:07:15,238] [8] [DEBUG] [Snowflake.Data.Core.SFSession] Update parameter map
[2018-09-04 23:07:15,243] [8] [DEBUG] [Snowflake.Data.Client.SnowflakeDbCommand] ExecuteNonQuery, command: insert into EDP_COMPLIANCE.DEV.TC_PROFILE_FORM values (?),(?)
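For reference, the ExecuteNonQuery call that produced the last log line above can be sketched as follows. This is a minimal sketch, not the reporter's actual code: the connection string values and the bound parameter values are placeholders, and only the table name and SQL text come from the log.

```csharp
using System.Data;
using Snowflake.Data.Client;

// Sketch of the parameterized insert seen in the log above.
// Connection string values and parameter values are placeholders.
using (IDbConnection conn = new SnowflakeDbConnection())
{
    conn.ConnectionString =
        "account=myaccount;user=myuser;password=mypassword;db=EDP_COMPLIANCE;schema=DEV";
    conn.Open();

    IDbCommand cmd = conn.CreateCommand();
    cmd.CommandText = "insert into EDP_COMPLIANCE.DEV.TC_PROFILE_FORM values (?),(?)";

    // The connector binds positional '?' placeholders by 1-based parameter name.
    var p1 = cmd.CreateParameter();
    p1.ParameterName = "1";
    p1.DbType = DbType.String;
    p1.Value = "value1";
    cmd.Parameters.Add(p1);

    var p2 = cmd.CreateParameter();
    p2.ParameterName = "2";
    p2.DbType = DbType.String;
    p2.Value = "value2";
    cmd.Parameters.Add(p2);

    cmd.ExecuteNonQuery();
}
```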
Driver version: (not specified)
Dotnet framework and version: .NET Core
Server version: (not specified)
You may get the server version by running a query: SELECT CURRENT_VERSION();
Client OS: Windows 10
This is the connection string I'm using: "scheme=https;host=snagajob.snowflakecomputing.com;port=443;account=snagajob;user=[email protected];password=MYPASSWORD;authenticator=https://snagajob.okta.com";
I know the authenticator key may not currently be supported by the .NET driver, but it's what we use in other projects that rely on the JDBC driver.
The error I'm getting when trying to connect is: "Incorrect username or password was specified".
I'm using the same username and password I use for Okta, and I log in to the Snowflake web UI through Okta.
Was this driver tested on .NET? It doesn't seem to successfully connect to Snowflake at all.
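If the driver does honor the authenticator property, a connection attempt would look roughly like the sketch below. All values are placeholders, and this assumes the installed driver version actually supports federated (Okta) authentication; a version that silently ignores the authenticator key would fall back to username/password login and could produce exactly the "Incorrect username or password" error reported here.

```csharp
using System.Data;
using Snowflake.Data.Client;

// Sketch of connecting through an Okta authenticator URL.
// Account, user, password, and the Okta URL are placeholders.
using (IDbConnection conn = new SnowflakeDbConnection())
{
    conn.ConnectionString =
        "scheme=https;host=myaccount.snowflakecomputing.com;port=443;" +
        "account=myaccount;user=me@example.com;password=MYPASSWORD;" +
        "authenticator=https://myorg.okta.com";
    conn.Open();
}
```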
This will resolve #37
Hi. I am a graphic designer. I volunteer to design logos for open source projects. I could design one for you to use in the readme file. What do you say?