load_as_spark() returns error "No active SparkSession was found. load_as_spark requires running in a PySpark application." when used in Django Rest Api. #508

Open
Shabbir-Khan-12 opened this issue Jun 20, 2024 · 0 comments · May be fixed by #509

Comments

@Shabbir-Khan-12

Hi all, I am trying to load a Delta table using load_as_spark() from the delta-sharing library inside a REST API in a Django app. The issue is that when I run my Django service, a Spark session (created by a different app in the same project) starts at startup, and when I hit the REST API to read the Delta table via the Delta Sharing protocol, it fails with:

"No active SparkSession was found. load_as_spark requires running in a PySpark application."

This comes from a condition check in load_as_spark():

```python
spark = SparkSession.getActiveSession()
assert spark is not None, (
    "No active SparkSession was found. "
    "load_as_spark requires running in a PySpark application."
)
```

SparkSession.getActiveSession() only returns a session that is active in the current thread. In my case the session was started in a different thread, so the check fails and I am unable to load data from the Delta table.
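A minimal sketch of the situation and one possible workaround, assuming a session created at startup in another thread and a placeholder Delta Sharing table URL (`config.share#my_share.my_schema.my_table` is hypothetical). Since the assertion only depends on getActiveSession(), fetching the data with load_as_pandas() and converting it with the existing session avoids the check entirely:

```python
import threading

import delta_sharing
from pyspark.sql import SparkSession

# Hypothetical table URL in the usual "<profile-file>#<share>.<schema>.<table>" form.
TABLE_URL = "config.share#my_share.my_schema.my_table"

# Session created at startup, e.g. in another Django app's ready() hook.
spark = SparkSession.builder.appName("django-startup").getOrCreate()


def read_in_request_thread():
    # In a thread other than the one that created the session,
    # getActiveSession() can return None, which is what trips the
    # assertion inside load_as_spark().
    print(SparkSession.getActiveSession())  # often None here

    # Workaround that does not rely on getActiveSession(): load the shared
    # table as a pandas DataFrame and convert it using the existing session.
    pdf = delta_sharing.load_as_pandas(TABLE_URL)
    return spark.createDataFrame(pdf)


t = threading.Thread(target=read_in_request_thread)
t.start()
t.join()
```

This sidesteps the problem rather than fixing it; for large tables the pandas round trip is expensive, so a change in load_as_spark() itself (as proposed in the linked PR) would be the proper fix.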

@Shabbir-Khan-12 Shabbir-Khan-12 linked a pull request Jun 20, 2024 that will close this issue