Apache Airflow Provider(s)
databricks
Versions of Apache Airflow Providers
6.0.0
Apache Airflow version
2.8.1
Operating System
Amazon
Deployment
Amazon (AWS) MWAA
Deployment details
No response
What happened
When I click on the link to see the job run details, it directs to https://(s3 path of my xcom json file) instead of to our Databricks workspace.
What you think should happen instead
I expect the link to work and direct to the job run details in our team's Databricks workspace.
How to reproduce
Anything else
Currently our custom XCom backend stores everything in S3; it does not offload only large values, as other XCom backends do. The behavior of the extra link should not depend on that, though. A minimal sketch of that kind of backend follows.
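For context, here is a minimal sketch of the kind of backend in play, assuming a hypothetical bucket name, key layout, and class name (our actual implementation differs):

```python
from __future__ import annotations

import json
import uuid
from typing import Any

import boto3
from airflow.models.xcom import BaseXCom

BUCKET = "my-xcom-bucket"  # hypothetical bucket name


class S3XComBackend(BaseXCom):
    """Offloads every XCom value to S3 and stores only a reference in the DB."""

    PREFIX = "xcom_s3://"

    @staticmethod
    def serialize_value(value: Any, **kwargs) -> Any:
        # Write the real payload to S3; persist only the reference string.
        key = f"xcom/{uuid.uuid4()}.json"
        boto3.client("s3").put_object(Bucket=BUCKET, Key=key, Body=json.dumps(value))
        return BaseXCom.serialize_value(f"{S3XComBackend.PREFIX}{key}", **kwargs)

    @staticmethod
    def deserialize_value(result) -> Any:
        # Resolve the reference back into the real payload.
        ref = BaseXCom.deserialize_value(result)
        if isinstance(ref, str) and ref.startswith(S3XComBackend.PREFIX):
            key = ref[len(S3XComBackend.PREFIX):]
            body = boto3.client("s3").get_object(Bucket=BUCKET, Key=key)["Body"].read()
            return json.loads(body)
        return ref
```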
I had a look at https://airflow.apache.org/docs/apache-airflow-providers-databricks/6.0.0/_modules/airflow/providers/databricks/operators/databricks.html#DatabricksJobRunLink.get_link and https://github.com/apache/airflow/blob/2.8.1/airflow/models/xcom.py#L873-L876, but I doubt that TYPE_CHECKING is true at runtime; if it were, the provider would not be using our custom class at all.
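For reference, the relevant pattern, paraphrased from the two sources above (approximate, not a verbatim quote):

```python
# In the provider (operators/databricks.py, roughly): the extra link is
# whatever the XCom machinery returns for the run page URL key.
class DatabricksJobRunLink(BaseOperatorLink):
    name = "See Databricks Job Run"

    def get_link(self, operator, *, ti_key):
        return XCom.get_value(key=XCOM_RUN_PAGE_URL_KEY, ti_key=ti_key)


# In airflow/models/xcom.py (roughly): TYPE_CHECKING is only true during
# static analysis, so at runtime XCom should resolve to the configured
# custom backend.
if TYPE_CHECKING:
    XCom = BaseXCom  # alias for type checkers only
else:
    XCom = resolve_xcom_backend()  # reads [core] xcom_backend from the config
```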
I also tried overriding the orm_deserialize_value method, hoping that would fix the issue, but I ran into the problem I described in #44232 (comment) (contributions to that discussion are very welcome as well).
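The override I attempted looked roughly like this (again a sketch; it relies on the backend's own deserialize_value from the sketch above):

```python
from typing import Any

from airflow.models.xcom import BaseXCom


class S3XComBackend(BaseXCom):
    def orm_deserialize_value(self) -> Any:
        # The default implementation deliberately avoids network calls and
        # returns the stored value as-is, i.e. our S3 reference. Fetching the
        # real payload here should let UI components (including extra links)
        # see it, at the cost of an S3 round trip per rendered XCom row.
        return S3XComBackend.deserialize_value(self)
```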
Are you willing to submit PR?
Code of Conduct