
All Questions

1 vote
0 answers
26 views

Error parsing CSV File when copying Data to Snowflake after July 3rd Incident

I am encountering an error while trying to copy a CSV file into Snowflake from an S3 bucket. This process was functioning correctly until an incident occurred in Snowflake on July 3rd. The error ...
mict0 • 43
0 votes
1 answer
50 views

Snowflake - get notified of any task errors when copy is set to "on_error = continue"

I have a task in Snowflake that runs a COPY command with "on_error = continue" so that if any file in my S3 bucket fails to copy, the job does not abort immediately. However, I have also ...
Estrobelai
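For notification after an ON_ERROR = CONTINUE load, one common approach is to query COPY_HISTORY for files with errors right after the task runs. A minimal sketch; the table name and one-hour look-back window are placeholder assumptions:

```sql
-- Sketch: list files the last COPY loads left partially loaded or failed.
-- MY_TABLE and the 1-hour window are hypothetical.
SELECT file_name, status, error_count, first_error_message
FROM TABLE(information_schema.copy_history(
       table_name => 'MY_TABLE',
       start_time => DATEADD(hour, -1, CURRENT_TIMESTAMP())))
WHERE error_count > 0;
```

A follow-on task or an alert can run this query and, for example, call SYSTEM$SEND_EMAIL when it returns rows.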
1 vote
0 answers
55 views

Do I need to create an AWS VPC interface endpoint for loading from Amazon S3 on Snowsight?

In the following Snowflake doc for setting up AWS PrivateLink, it is stated that "The Snowflake clients (e.g. SnowSQL, JDBC driver) require access to Amazon S3 to perform various runtime ...
Pango853
0 votes
0 answers
74 views

Load data from S3 to Snowflake, overwrite vs using stream

I'm trying to load files from S3 to Snowflake using copy into commands. The pipeline is scheduled using Airflow. Each day there would be new files (e.g. source_20240501.csv). In order to rerun a ...
user3735871
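For idempotent daily reruns, one option is to delete that day's rows and re-copy the single file with FORCE = TRUE, since Snowflake's load metadata would otherwise silently skip a file it has already loaded. A sketch; the table, stage, and load_date column are assumptions:

```sql
-- Hypothetical names: my_table, my_stage, load_date.
DELETE FROM my_table WHERE load_date = '2024-05-01';

COPY INTO my_table
FROM @my_stage
PATTERN = '.*source_20240501[.]csv'
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
FORCE = TRUE;  -- re-load even though load metadata marks the file as loaded
```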
0 votes
1 answer
369 views

Load data from S3 to Snowflake with Lambda invocations in sequential order, one by one

I'm trying to load data from my S3 bucket folder into a Snowflake table using Lambda. I have set up an S3 trigger where my files are ingested, and formed an integration between Lambda and ...
Faiz Qureshi
0 votes
0 answers
31 views

Stage customer data to Snowflake

I need to import CSV files, uploaded by our customer, into our Snowflake account. The import itself can be done with the COPY INTO command, but googling around I haven't found an exact solution for how I can ...
Moravas • 133
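A minimal sketch of the staging flow (stage, file-format, and table names are hypothetical): the customer uploads with PUT into a named internal stage, and a scheduled COPY INTO drains it.

```sql
CREATE STAGE IF NOT EXISTS customer_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Customer side, e.g. from SnowSQL:
-- PUT file:///tmp/customers.csv @customer_stage AUTO_COMPRESS = TRUE;

COPY INTO customer_table FROM @customer_stage
  PURGE = TRUE;  -- remove staged files once they are loaded
```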
0 votes
0 answers
108 views

Snowflake pipe, error when handling deletion record

I have set up a Snowflake pipe to ingest S3 files that were created by an AWS DMS migration task. The pipe ingests the files into a Snowflake table. S3 files with inserts work fine, but when a file has a ...
konkani • 498
0 votes
1 answer
330 views

Unloading data from Snowflake to S3 bucket in xml format

I need to unload the data I have in Snowflake to an S3 bucket, to a file with an .xml extension. In reality, the required output is not exactly .xml, but rather comma-separated values enclosed in tags. But I ...
jeremyone
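One way to get tag-wrapped lines out is to build the text in the COPY's SELECT and unload it as a delimiter-less "CSV". A sketch with hypothetical table, column, and stage names:

```sql
COPY INTO @my_stage/out/data.xml      -- with SINGLE = TRUE the path is used as given
FROM (
  SELECT '<row>' || col1 || ',' || col2 || '</row>'
  FROM my_table
)
FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = NONE ESCAPE = NONE)
SINGLE = TRUE
OVERWRITE = TRUE;
```

Disabling the field delimiter and escaping lets the pre-built text pass through unmodified.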
0 votes
0 answers
129 views

How to convert "–" values back to their originals in an AWS S3 bucket, using a Snowflake file format

Can someone help me with the issue below, going from a Snowflake file format to an AWS S3 bucket? I have a requirement to unload data from Snowflake to an AWS S3 bucket in CSV format. During this ...
Shaik Bakshu
0 votes
1 answer
36 views

JSON data syntax error while fetching some column data | Snowflake

While working with JSON data I found that one column contains the scope-resolution operator in its name. Column name: "system::embeddable_last_seen". How do I access this column? When I was ...
Alpesh Gadgilwar
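The `::` is being parsed as a cast operator; quoting the key in the path (or using GET) avoids that. A sketch, assuming a VARIANT column named raw in a table my_json_table:

```sql
-- Double-quote the key so the :: inside it is not treated as a cast.
SELECT raw:"system::embeddable_last_seen" FROM my_json_table;

-- Equivalent, with no path parsing at all:
SELECT GET(raw, 'system::embeddable_last_seen') FROM my_json_table;
```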
0 votes
1 answer
609 views

While creating an Iceberg table in Snowflake I am getting an error: "Unified iceberg tables require a catalog integration to be specified"

I am trying to set up an Iceberg table in Snowflake by following the official Snowflake blog. When I tried to create the table using create or replace iceberg table my_iceberg_table_test with EXTERNAL_VOLUME ...
JD-V • 3,616
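If the intent is a Snowflake-managed Iceberg table (rather than one backed by an external catalog), naming Snowflake itself as the catalog avoids the need for a catalog integration. A sketch with an assumed, pre-existing external volume:

```sql
CREATE OR REPLACE ICEBERG TABLE my_iceberg_table_test (id INT)
  CATALOG = 'SNOWFLAKE'                 -- Snowflake acts as the Iceberg catalog
  EXTERNAL_VOLUME = 'my_ext_vol'        -- hypothetical, must already exist
  BASE_LOCATION = 'my_iceberg_table_test/';
```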
3 votes
1 answer
356 views

How to create Snowflake table (dynamically find data types) and load from stage (AWS S3)?

I've got a CSV, about 200 columns and over 200k rows in AWS S3. I need to load into Snowflake via stage. I did my best to ascertain the data types and "create table" but keep getting stuck ...
Chuck • 1,241
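Rather than hand-typing 200 column definitions, INFER_SCHEMA can derive them from the staged file and CREATE TABLE ... USING TEMPLATE can consume the result. A sketch; the stage, file, and file-format names are hypothetical:

```sql
-- CSV schema inference needs PARSE_HEADER = TRUE on the file format.
CREATE FILE FORMAT IF NOT EXISTS my_csv_ff TYPE = CSV PARSE_HEADER = TRUE;

CREATE TABLE my_table USING TEMPLATE (
  SELECT ARRAY_AGG(OBJECT_CONSTRUCT(*))
  FROM TABLE(INFER_SCHEMA(
         LOCATION    => '@my_stage/big.csv',
         FILE_FORMAT => 'my_csv_ff')));

COPY INTO my_table
FROM @my_stage/big.csv
FILE_FORMAT = (FORMAT_NAME = 'my_csv_ff')
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;  -- load by header name, not position
```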
0 votes
1 answer
129 views

Snowflake: How to find all error records while loading Parquet data files from AWS S3 into Snowflake tables

copy into db.schema.ZDATA_4 from @stage/load/zdata_4/ file_format=(TYPE=parquet ) ON_ERROR=CONTINUE Force=TRUE MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE validation_mode =return_all_errors; While ...
Sudhanshu Prakash
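Note that VALIDATION_MODE turns the COPY into a validation-only dry run, so nothing loads while it is set. To inspect rows rejected by a load that actually ran with ON_ERROR = CONTINUE, the VALIDATE table function is the usual route (a sketch; whether it covers a given Parquet load is worth checking against the docs):

```sql
-- Report rows rejected by the most recent COPY into this table.
SELECT * FROM TABLE(VALIDATE(db.schema.ZDATA_4, JOB_ID => '_last'));
```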
0 votes
1 answer
330 views

How long does Snowpipe keep track of files that have already been loaded?

I've created a Snowpipe to load continuous data from an S3 bucket. In the S3 bucket I have the data compressed in Parquet files, but from time to time this data may be loaded again, and it is replacing the ...
svalls • 11
1 vote
1 answer
233 views

Involvement of S3 storage with JDBC queries

In our company we are working with the Snowflake JDBC driver. Playing around a little, we noticed that we did not get the full set of results back when executing a query with a large result set. Only ...
sharp_tunes
