| repo | instance_id | base_commit | patch | test_patch | problem_statement | hints_text | created_at | version | FAIL_TO_PASS | PASS_TO_PASS | environment_setup_commit | traceback | __index_level_0__ |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Qiskit/qiskit | Qiskit__qiskit-6102 | 0c8bb3dbf8d688590431ca79a83ba8aede84ed20 | diff --git a/qiskit/opflow/operator_base.py b/qiskit/opflow/operator_base.py
--- a/qiskit/opflow/operator_base.py
+++ b/qiskit/opflow/operator_base.py
@@ -14,6 +14,7 @@
import itertools
from abc import ABC, abstractmethod
+from copy import deepcopy
from typing import Dict, List, Optional, Set, Tuple, Union, cast
... | `TaperedPauliSumOp` is incompatible with `VQE`
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Information
- **Qiskit Terra version**: `master` @ 0c8bb3dbf8d688590431ca79a83ba8aede84ed20
- **Python version**: 3.8.8
- **... | @ikkoham I am tagging you because (if I recall correctly) you mainly worked on the two-qubit-reduction and tapering code. | 2021-03-29T01:03:38Z | [] | [] |
Traceback (most recent call last):
File "terra-bug-minimal.py", line 31, in <module>
result = vqe.compute_minimum_eigenvalue(tapered_qubit_op)
File "/home/oss/Files/Qiskit/src/qiskit-terra/qiskit/algorithms/minimum_eigen_solvers/vqe.py", line 412, in compute_minimum_eigenvalue
self._expect_op = self.co... | 1,736 | |||
Qiskit/qiskit | Qiskit__qiskit-6213 | 2b7046f886e090bbbf22a989ff8130b6bd283d5c | diff --git a/qiskit/pulse/builder.py b/qiskit/pulse/builder.py
--- a/qiskit/pulse/builder.py
+++ b/qiskit/pulse/builder.py
@@ -740,6 +740,8 @@ def seconds_to_samples(seconds: Union[float, np.ndarray]) -> Union[int, np.ndarr
Returns:
The number of samples for the time to elapse
"""
+ if isinstance(... | Pulse: seconds_to_samples() fails when passed an array
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Information
- **Qiskit Terra version**: 0.16.1
- **Python version**: 3.7
- **Operating system**: Linux (under WSL)
... | 2021-04-13T14:10:20Z | [] | [] |
Traceback (most recent call last):
File "./test_qp_seconds_to_samples.py", line 7, in <module>
time_dt = qp.seconds_to_samples(times_s)
File "/nix/store/0wf80vpsyn5jmnarclzmdblwyaanhpw0-python3.7-qiskit-terra-0.16.1/lib/python3.7/site-packages/qiskit/pulse/builder.py", line 665, in seconds_to_samples
r... | 1,746 | ||||
Qiskit/qiskit | Qiskit__qiskit-6228 | cf1241cbe1b24ce25865dc95e7a425456dd5b4cf | diff --git a/qiskit/pulse/builder.py b/qiskit/pulse/builder.py
--- a/qiskit/pulse/builder.py
+++ b/qiskit/pulse/builder.py
@@ -740,6 +740,8 @@ def seconds_to_samples(seconds: Union[float, np.ndarray]) -> Union[int, np.ndarr
Returns:
The number of samples for the time to elapse
"""
+ if isinstance(... | Pulse: seconds_to_samples() fails when passed an array
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Information
- **Qiskit Terra version**: 0.16.1
- **Python version**: 3.7
- **Operating system**: Linux (under WSL)
... | 2021-04-14T15:32:30Z | [] | [] |
Traceback (most recent call last):
File "./test_qp_seconds_to_samples.py", line 7, in <module>
time_dt = qp.seconds_to_samples(times_s)
File "/nix/store/0wf80vpsyn5jmnarclzmdblwyaanhpw0-python3.7-qiskit-terra-0.16.1/lib/python3.7/site-packages/qiskit/pulse/builder.py", line 665, in seconds_to_samples
r... | 1,749 | ||||
Qiskit/qiskit | Qiskit__qiskit-6377 | f21d991d09a0ef2c47605df750687b67462fc1e6 | diff --git a/qiskit/result/__init__.py b/qiskit/result/__init__.py
--- a/qiskit/result/__init__.py
+++ b/qiskit/result/__init__.py
@@ -24,9 +24,22 @@
ResultError
Counts
marginal_counts
+
+Distributions
+=============
+
+.. autosummary::
+ :toctree: ../stubs/
+
+ ProbDistribution
+ QuasiDistribution
+
... | `from qiskit.providers.ibmq import least_busy` raises an ImportError with Terra's main branch
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Information
- **Qiskit Terra version**: latest(main)
- **Python version**: 3.8... | 2021-05-08T10:40:59Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/ima/envs/dev38/lib/python3.8/site-packages/qiskit/providers/ibmq/__init__.py", line 88, in <module>
from .ibmqfactory import IBMQFactory
File "/Users/ima/envs/dev38/lib/python3.8/site-packages/qiskit/providers/ibmq/ibmqfa... | 1,779 | ||||
Qiskit/qiskit | Qiskit__qiskit-6588 | 8c062c777246e386b7306e94aa9e8094b2a16416 | diff --git a/qiskit/circuit/classicalfunction/boolean_expression.py b/qiskit/circuit/classicalfunction/boolean_expression.py
--- a/qiskit/circuit/classicalfunction/boolean_expression.py
+++ b/qiskit/circuit/classicalfunction/boolean_expression.py
@@ -12,14 +12,14 @@
"""A quantum oracle constructed from a logical exp... | Missing tweedledum as required package
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Information
- **Qiskit Terra version**: 0.17.0
- **Python version**: 3.8.8
- **Operating system**: linux
### What is the current ... | Tweedledum is an optional dependency, mostly because it hasn't fully stabilized the python api yet and also because there are some packaging issues for some of our supported platforms. However, it is correctly listed in the setup.py:
https://github.com/Qiskit/qiskit-terra/blob/main/setup.py#L108
which enables yo... | 2021-06-16T15:41:06Z | [] | [] |
Traceback (most recent call last):
File "<ipython-input-97-28069418327a>", line 3, in <module>
oracle = PhaseOracle('x & ~y') # previous API: qiskit.aqua.components.oracles.LogicalExpressionOracle
File "/opt/conda/lib/python3.8/site-packages/qiskit/circuit/library/phase_oracle.py", line 55, in... | 1,811 | |||
Qiskit/qiskit | Qiskit__qiskit-6658 | 21b875d9b0443567452e40431bb6b783563142a1 | diff --git a/qiskit/circuit/classicalfunction/boolean_expression.py b/qiskit/circuit/classicalfunction/boolean_expression.py
--- a/qiskit/circuit/classicalfunction/boolean_expression.py
+++ b/qiskit/circuit/classicalfunction/boolean_expression.py
@@ -25,15 +25,19 @@
class BooleanExpression(ClassicalElement):
"""T... | Grover's Migration from Aqua to Qiskit Terra
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Informations
- **Qiskit version**: 0.26.0
- **Python version**: 3.8
- **Operating system**: MacOS
### What is the current b... | Good catch! That change was not intended (at least no silently). Thanks for reporting this, we're looking into it 👍🏻
Tweedledum is an optional dependency, mostly because it hasn't fully stabilized the python api yet and also because there are some packaging issues for some of our supported platforms. However, it is ... | 2021-06-29T12:41:36Z | [] | [] |
Traceback (most recent call last):
File "<ipython-input-97-28069418327a>", line 3, in <module>
oracle = PhaseOracle('x & ~y') # previous API: qiskit.aqua.components.oracles.LogicalExpressionOracle
File "/opt/conda/lib/python3.8/site-packages/qiskit/circuit/library/phase_oracle.py", line 55, in... | 1,823 | |||
Qiskit/qiskit | Qiskit__qiskit-6847 | 6c9c906f12dc6b8d9929cd8b1e2e28601ac9d827 | diff --git a/qiskit/circuit/parameterexpression.py b/qiskit/circuit/parameterexpression.py
--- a/qiskit/circuit/parameterexpression.py
+++ b/qiskit/circuit/parameterexpression.py
@@ -265,7 +265,7 @@ def _apply_operation(
return ParameterExpression(parameter_symbols, expr)
- def gradient(self, param) -> ... | ParameterExpression throws an error for gradients with complex coefficients.
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Information
- **Qiskit Terra version**: 0.18
- **Python version**: 3.8
- **Operating system**:... | 2021-07-30T20:34:57Z | [] | [] |
Traceback (most recent call last):
File "/qiskit-terra/venv/lib/python3.8/site-packages/IPython/core/interactiveshell.py", line 3437, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "<ipython-input-28-058190e5d73e>", line 1, in <module>
p2.gradient(p)
File "/qiskit-terra/qiskit/ci... | 1,846 | ||||
Qiskit/qiskit | Qiskit__qiskit-6930 | 1eb668171722fb60abd90f942ca5b65ac56d9b34 | diff --git a/qiskit/visualization/__init__.py b/qiskit/visualization/__init__.py
--- a/qiskit/visualization/__init__.py
+++ b/qiskit/visualization/__init__.py
@@ -126,7 +126,7 @@
from qiskit.visualization.transition_visualization import visualize_transition
from qiskit.visualization.array import array_to_latex
-fro... | Transpiler raises an exception "Command 'pdflatex -v' returned non-zero exit status 1" on Windows
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Information
- **Qiskit Terra version**: main branch (commit b419e3968f7a88f... | 2021-08-20T12:39:30Z | [] | [] |
Traceback (most recent call last):
File "C:/.../qiskit-terra/_sandbox/pdflatex_bug.py", line 1, in <module>
from qiskit import QuantumCircuit, transpile
File "C:\...\qiskit-terra\qiskit\__init__.py", line 52, in <module>
from qiskit.execute_function import execute # noqa
File "C:\...\qiskit-terra\q... | 1,863 | ||||
Qiskit/qiskit | Qiskit__qiskit-710 | bddb8af58ddcb97f656bbcdd6b460edadd3f6508 | diff --git a/qiskit/_compositegate.py b/qiskit/_compositegate.py
--- a/qiskit/_compositegate.py
+++ b/qiskit/_compositegate.py
@@ -101,6 +101,11 @@ def inverse(self):
self.inverse_flag = not self.inverse_flag
return self
+ def reapply(self, circ):
+ """Reapply this gate to corresponding qu... | Can not combine or extend a circuit which is built with CompositeGate
<!-- ⚠️ If you do not respect this template, your issue will be closed -->
<!-- ⚠️ Make sure to browse the opened and closed issues -->
### Informations
Can not use `+` operator to combine or extend a circuit.
- **Qiskit (Python SDK) version**:... | 2018-08-01T17:34:04Z | [] | [] |
Traceback (most recent call last):
File "qiskit_acqua/svm/svm_qkernel.py", line 327, in <module>
a.run()
File "qiskit_acqua/svm/svm_qkernel.py", line 306, in run
self.train(self.training_dataset, self.class_labels)
File "qiskit_acqua/svm/svm_qkernel.py", line 182, in train
kernel_matrix = self.... | 1,890 | ||||
Qiskit/qiskit | Qiskit__qiskit-7389 | 5c4cd2bbcfaf32f4db112f76f8f13128872afde5 | diff --git a/qiskit/circuit/controlflow/break_loop.py b/qiskit/circuit/controlflow/break_loop.py
--- a/qiskit/circuit/controlflow/break_loop.py
+++ b/qiskit/circuit/controlflow/break_loop.py
@@ -15,7 +15,7 @@
from typing import Optional
from qiskit.circuit.instruction import Instruction
-from .builder import Instru... | `DAGCircuitError` when `if_test` is called in `for_loop` scope.
### Environment
- **Qiskit Terra version**: 0.19.0
- **Python version**: any
- **Operating system**: any
### What is happening?
Encounter `DAGCircuitError` when `if_test` is called in `for_loop` scope.
### How can we reproduce the issue?
```
from... | The issue here is that the control-flow builder blocks don't track the registers that are used, only the bits, so they don't define any necessary registers in their body blocks. We need to add tracking of these to `ControlFlowBuilderBlock`, and add them when the circuits are created to fix this.
It's a `DAGCircuitE... | 2021-12-09T16:45:30Z | [] | [] |
Traceback (most recent call last):
File "nest_error.py", line 14, in <module>
print(circ.data[0][0].params[2])
File "site-packages/qiskit/circuit/quantumcircuit.py", line 364, in __str__
return str(self.draw(output="text"))
File "site-packages/qiskit/circuit/quantumcircuit.py", line 1865, in draw
... | 1,934 | |||
Qiskit/qiskit | Qiskit__qiskit-7407 | 93a8172c3ca8fda510393087a861ef12d661906f | diff --git a/qiskit/circuit/library/standard_gates/equivalence_library.py b/qiskit/circuit/library/standard_gates/equivalence_library.py
--- a/qiskit/circuit/library/standard_gates/equivalence_library.py
+++ b/qiskit/circuit/library/standard_gates/equivalence_library.py
@@ -354,6 +354,13 @@
def_ry.append(RGate(theta, ... | Add translation to RX(X) basis to equivalence library
### What is the expected enhancement?
Allow using `RX` and `RXX` as basis gates by adding appropriate entries to the equivalence library:
* `RZ = H RX H` (and `RZZ = HH RXX HH`)
* `RY = Sdg RX S` (and equivalently for 2 qubits)
Currently this translation i... | `basis_gates=["rx", "rz"]` works.
So I agree that it should be enough to add the equivalence relations you have highlighted. | 2021-12-13T23:10:33Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/jul/Qiskit/qiskit-terra/qiskit/compiler/transpiler.py", line 319, in transpile
circuits = parallel_map(_transpile_circuit, list(zip(circuits, transpile_args)))
File "/Users/jul/Qiskit/qiskit-terra/qiskit/tools/parallel.py... | 1,937 | |||
Qiskit/qiskit | Qiskit__qiskit-8321 | e9f2a2b1975bce588a33a4675e8dcce3920b3493 | diff --git a/qiskit/visualization/latex.py b/qiskit/visualization/latex.py
--- a/qiskit/visualization/latex.py
+++ b/qiskit/visualization/latex.py
@@ -420,7 +420,7 @@ def _build_latex_array(self):
for node in layer:
op = node.op
num_cols_op = 1
- wire_list =... | mpl circ drawer fails with `idle_wires=False`
### Environment
- **Qiskit Terra version**: latest
- **Python version**:
- **Operating system**:
### What is happening?
```python
qc = QuantumCircuit(6, 5)
qc.x(5)
qc.h(range(6))
qc.cx(range(5),5)
qc.h(range(5))
qc.measure(range(5), range(5))
backend = p... | A self-contained reproducer for assisting debugging:
```python
from qiskit import QuantumCircuit, transpile
from qiskit.providers.fake_provider import FakeKolkata
qc = QuantumCircuit(6, 5)
qc.x(5)
qc.h(range(6))
qc.cx(range(5),5)
qc.h(range(5))
qc.measure(range(5), range(5))
trans_qc = transpile(qc, Fak... | 2022-07-11T16:27:45Z | [] | [] |
Traceback (most recent call last):
Input In [9] in <cell line: 2>
trans_qc.draw('mpl', idle_wires=False)
File /opt/miniconda3/envs/qiskit/lib/python3.10/site-packages/qiskit/circuit/quantumcircuit.py:1907 in draw
return circuit_drawer(
File /opt/miniconda3/envs/qiskit/lib/python3.10/site-packages/qi... | 2,047 | |||
Qiskit/qiskit | Qiskit__qiskit-885 | 17fda19325960d1c17912bb34744f24f73f7469f | diff --git a/qiskit/_compositegate.py b/qiskit/_compositegate.py
--- a/qiskit/_compositegate.py
+++ b/qiskit/_compositegate.py
@@ -15,7 +15,7 @@
class CompositeGate(Gate):
"""Composite gate, a sequence of unitary gates."""
- def __init__(self, name, param, args, circuit=None):
+ def __init__(self, name, p... | Using simulator instructions breaks latex circuit drawer
### What is the current behavior?
Using circuit simulator instructions (such as [`snapshot`](https://qiskit.org/documentation/_autodoc/qiskit._quantumcircuit.html?highlight=snapshot#qiskit._quantumcircuit.QuantumCircuit.snapshot)) cause those instructions to be ... | Also for the experiment.
@ajavadia can you provide the steps to reproduce and the expected behavior, please?
Since is in my bug I will. But it is pretty clear from the description.
```python
from qiskit import QuantumRegister, ClassicalRegister, QuantumCircuit
from qiskit.tools.visualization import circuit_drawer... | 2018-09-10T18:53:12Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/salva/workspace/qiskit-sdk-py/qiskit/tools/visualization/_circuit_visualization.py", line 77, in circuit_drawer
return latex_circuit_drawer(circuit, basis, scale, filename)
File "/Users/salva/workspace/qiskit-sdk-py/qiski... | 2,112 | |||
Qiskit/qiskit | Qiskit__qiskit-8978 | de8e4dd83627a63fb00f3c8f65bd1dd8bb7e2eac | diff --git a/qiskit/transpiler/passes/layout/vf2_layout.py b/qiskit/transpiler/passes/layout/vf2_layout.py
--- a/qiskit/transpiler/passes/layout/vf2_layout.py
+++ b/qiskit/transpiler/passes/layout/vf2_layout.py
@@ -154,6 +154,11 @@ def run(self, dag):
if len(cm_graph) == len(im_graph):
cho... | VF2 layout fails with cryptic error message when instruction properties are not available
### Environment
- **Qiskit Terra version**: 0.22.0
- **Python version**: 3.9.12
- **Operating system**: Windows 10
### What is happening?
```
Traceback (most recent call last):
File ".\src\playground.py", line 87, in <m... | Yeah, this is a bug `VF2Layout`, should operate even if there are no error rates present in the target. In those cases it just shouldn't do the heuristic scoring and just pick the first isomorphic subgraph it finds.
Just for posterity the `VF2PostLayout` pass is different and does require scores, but in that case it... | 2022-10-20T21:07:23Z | [] | [] |
Traceback (most recent call last):
File ".\src\playground.py", line 87, in <module>
transpile(qc, target=target, optimization_level=2)
File ".\lib\site-packages\qiskit\compiler\transpiler.py", line 382, in transpile
_serial_transpile_circuit(
File ".\lib\site-packages\qiskit\compiler\transpiler.py",... | 2,132 | |||
Qiskit/qiskit | Qiskit__qiskit-8989 | de35b90ddb2f753066070c75a512291087a10968 | diff --git a/qiskit/algorithms/eigensolvers/vqd.py b/qiskit/algorithms/eigensolvers/vqd.py
--- a/qiskit/algorithms/eigensolvers/vqd.py
+++ b/qiskit/algorithms/eigensolvers/vqd.py
@@ -91,13 +91,16 @@ class VQD(VariationalAlgorithm, Eigensolver):
optimizer(Optimizer): A classical optimizer. Can either be a Q... | VQD with primitives for k > 2 fails to validate parameter values
### Environment
- **Qiskit Terra version**: 0.22.0
- **Python version**: 3.9.6
- **Operating system**: Fedora 36
### What is happening?
When running `VQD` with primitives, the program fails to validate parameter values when calculating fidel... | @ElePT | 2022-10-25T12:34:02Z | [] | [] |
Traceback (most recent call last):
File "/home/joel/Desktop/electronic-structure-methods/test/VQD/VQD_ bugtest.py", line 32, in <module>
result = vqd_instance.compute_eigenvalues(operator=qubit_op)
File "/home/joel/miniconda3/envs/Qiskit-0390/lib/python3.9/site-packages/qiskit/algorithms/eigensolvers/vqd.py... | 2,134 | |||
Qiskit/qiskit | Qiskit__qiskit-8995 | 7db16a5a7f87666dee9e32164a4bbf42bf6a13ea | diff --git a/qiskit/transpiler/target.py b/qiskit/transpiler/target.py
--- a/qiskit/transpiler/target.py
+++ b/qiskit/transpiler/target.py
@@ -1043,24 +1043,25 @@ def target_to_backend_properties(target: Target):
continue
qubit = qargs[0]
props_list = []
- ... | Transpiling fails if the target does not specify properties for measurement instructions
### Environment
- **Qiskit Terra version**: 0.22.0
- **Python version**: 3.9.12
- **Operating system**: Windows 10
### What is happening?
```
Traceback (most recent call last):
File ".\src\playground.py", line 86, in <mo... | 2022-10-25T19:23:19Z | [] | [] |
Traceback (most recent call last):
File ".\src\playground.py", line 86, in <module>
qc_transpiled = transpile(qc, target=target)
File ".\lib\site-packages\qiskit\compiler\transpiler.py", line 327, in transpile
unique_transpile_args, shared_args = _parse_transpile_args(
File ".\lib\site-packages\qisk... | 2,135 | ||||
Qiskit/qiskit | Qiskit__qiskit-8997 | 04fd2f2878f0cec162c65da7de3c50abc7d0bc00 | diff --git a/qiskit/transpiler/passes/layout/vf2_layout.py b/qiskit/transpiler/passes/layout/vf2_layout.py
--- a/qiskit/transpiler/passes/layout/vf2_layout.py
+++ b/qiskit/transpiler/passes/layout/vf2_layout.py
@@ -154,6 +154,11 @@ def run(self, dag):
if len(cm_graph) == len(im_graph):
cho... | VF2 layout fails with cryptic error message when instruction properties are not available
### Environment
- **Qiskit Terra version**: 0.22.0
- **Python version**: 3.9.12
- **Operating system**: Windows 10
### What is happening?
```
Traceback (most recent call last):
File ".\src\playground.py", line 87, in <m... | Yeah, this is a bug `VF2Layout`, should operate even if there are no error rates present in the target. In those cases it just shouldn't do the heuristic scoring and just pick the first isomorphic subgraph it finds.
Just for posterity the `VF2PostLayout` pass is different and does require scores, but in that case it... | 2022-10-25T21:51:38Z | [] | [] |
Traceback (most recent call last):
File ".\src\playground.py", line 87, in <module>
transpile(qc, target=target, optimization_level=2)
File ".\lib\site-packages\qiskit\compiler\transpiler.py", line 382, in transpile
_serial_transpile_circuit(
File ".\lib\site-packages\qiskit\compiler\transpiler.py",... | 2,136 | |||
Qiskit/qiskit | Qiskit__qiskit-9020 | 8eed4fa4a63fe36b7299364c42ffe1dfb144e146 | diff --git a/qiskit/transpiler/target.py b/qiskit/transpiler/target.py
--- a/qiskit/transpiler/target.py
+++ b/qiskit/transpiler/target.py
@@ -1043,24 +1043,25 @@ def target_to_backend_properties(target: Target):
continue
qubit = qargs[0]
props_list = []
- ... | Transpiling fails if the target does not specify properties for measurement instructions
### Environment
- **Qiskit Terra version**: 0.22.0
- **Python version**: 3.9.12
- **Operating system**: Windows 10
### What is happening?
```
Traceback (most recent call last):
File ".\src\playground.py", line 86, in <mo... | 2022-10-28T05:28:33Z | [] | [] |
Traceback (most recent call last):
File ".\src\playground.py", line 86, in <module>
qc_transpiled = transpile(qc, target=target)
File ".\lib\site-packages\qiskit\compiler\transpiler.py", line 327, in transpile
unique_transpile_args, shared_args = _parse_transpile_args(
File ".\lib\site-packages\qisk... | 2,143 | ||||
Qiskit/qiskit | Qiskit__qiskit-9076 | 599b663e694f01f514aebbc16c1f13ebd6dade78 | diff --git a/qiskit/primitives/backend_estimator.py b/qiskit/primitives/backend_estimator.py
--- a/qiskit/primitives/backend_estimator.py
+++ b/qiskit/primitives/backend_estimator.py
@@ -147,6 +147,9 @@ def __new__( # pylint: disable=signature-differs
self = super().__new__(cls)
return self
+ de... | Backend based primitives are not serializable via dill
### Environment
- **Qiskit Terra version**: 0.22
- **Python version**: 3.7
- **Operating system**: Any
### What is happening?
Backend based primitives can't be saved via dill. Reference primitives can be loaded/saved. `QuantumInstance` is also seriali... | Adding the following method to BackendSampler should fix this issue.
```python
def __getnewargs__(self):
return self._backend,
```
This should be fixed in Terra, but Terra 0.22.1 has been just released...
Workaround
```python
def __getnewargs__(self):
return self._backend,
BackendSampler.__... | 2022-11-04T13:46:52Z | [] | [] |
Traceback (most recent call last):
File "__dill_primitives.py", line 10, in <module>
sampler = dill.load(f)
File ".../envs/dev-qml/lib/site-packages/dill/_dill.py", line 313, in load
return Unpickler(file, ignore=ignore, **kwds).load()
File ".../envs/dev-qml/lib/site-packages/dill/_dill.py", line 52... | 2,150 | |||
Qiskit/qiskit | Qiskit__qiskit-9101 | 27da80d03f112b6225c80038e37169577bb8acd2 | diff --git a/qiskit/algorithms/eigensolvers/numpy_eigensolver.py b/qiskit/algorithms/eigensolvers/numpy_eigensolver.py
--- a/qiskit/algorithms/eigensolvers/numpy_eigensolver.py
+++ b/qiskit/algorithms/eigensolvers/numpy_eigensolver.py
@@ -20,11 +20,12 @@
from scipy import sparse as scisparse
from qiskit.opflow impo... | `NumPyEigensolver` does not support all `BaseOperator` instances.
### Environment
- **Qiskit Terra version**: 3ce1737b
- **Python version**: 3.10.6
- **Operating system**: macOS 13.0
### What is happening?
`NumPyEigensolver` and by extension `NumPyMinimumEigensolver` do not support all `BaseOperator` instances, ... | I'm happy to work on this issue myself. | 2022-11-08T16:48:24Z | [] | [] |
Traceback (most recent call last):
File "/Users/declanmillar/Projects/qiskit/qiskit-terra/test.py", line 10, in <module>
result = npe.compute_eigenvalues(op)
File "/Users/declanmillar/Projects/qiskit/qiskit-terra/qiskit/algorithms/eigensolvers/numpy_eigensolver.py", line 248, in compute_eigenvalues
sel... | 2,159 | |||
Qiskit/qiskit | Qiskit__qiskit-9149 | fbf5284510b59909e2ddb14ad0121be9777892e0 | diff --git a/qiskit/primitives/utils.py b/qiskit/primitives/utils.py
--- a/qiskit/primitives/utils.py
+++ b/qiskit/primitives/utils.py
@@ -12,9 +12,10 @@
"""
Utility functions for primitives
"""
-
from __future__ import annotations
+import numpy as np
+
from qiskit.circuit import Instruction, ParameterExpression... | Sampler fails on gates with unhashable parameters
### Environment
- **Qiskit Terra version**: fbff44bfe9ebc9d97203929b1bff5483fe06028a
- **Python version**: Python 3.10.8
- **Operating system**: Arch Linux
### What is happening?
See title.
### How can we reproduce the issue?
```python
import numpy as np
fr... | 2022-11-17T03:25:38Z | [] | [] |
Traceback (most recent call last):
File "/home/kjs/projects/qiskit-terra/scratch/sampler_hash_bug.py", line 14, in <module>
sampler_result = sampler.run([circuit]).result()
File "/home/kjs/projects/qiskit-terra/qiskit/primitives/base/base_sampler.py", line 184, in run
return self._run(
File "/home/k... | 2,165 | ||||
Qiskit/qiskit | Qiskit__qiskit-9310 | 244400ad19278125e39a4e35806a79fd9b824655 | diff --git a/qiskit/utils/classtools.py b/qiskit/utils/classtools.py
--- a/qiskit/utils/classtools.py
+++ b/qiskit/utils/classtools.py
@@ -25,11 +25,6 @@
_MAGIC_STATICMETHODS = {"__new__"}
_MAGIC_CLASSMETHODS = {"__init_subclass__", "__prepare__"}
-# `type` itself has several methods (mostly dunders). When we are ... | `qiskit.test.decorators.enforce_subclasses_call` machinery is broken in 3.11.1
### Environment
- **Qiskit Terra version**:
HEAD of main, currently `dd7f9390cf07`
- **Python version**:
3.11.1
- **Operating system**:
Both macOS and ubuntu, so far.
### What is happening?
`import test.python` r... | I'm having trouble reproducing this using only `wrap_method`. For example, the following works as expected:
```python
import qiskit.utils
class Foo:
pass
def print_blerg(*_):
print("blerg")
qiskit.utils.classtools.wrap_method(Foo, "__init_subclass__", after=print_blerg)
class Goo(Foo):
pa... | 2022-12-20T16:34:01Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/ihincks/ibm/qiskit-terra/test/__init__.py", line 16, in <module>
from qiskit.test.utils import generate_cases
File "/Users/ihincks/ibm/qiskit-terra/qiskit/test/__init__.py", line 15, in <module>
from .base import Qis... | 2,181 | |||
Qiskit/qiskit | Qiskit__qiskit-9726 | 3284ea088bbdc3b914dc2be98d150a31e04322dc | diff --git a/qiskit/quantum_info/operators/symplectic/pauli.py b/qiskit/quantum_info/operators/symplectic/pauli.py
--- a/qiskit/quantum_info/operators/symplectic/pauli.py
+++ b/qiskit/quantum_info/operators/symplectic/pauli.py
@@ -147,7 +147,8 @@ class initialization (``Pauli('-iXYZ')``). A ``Pauli`` object can be
... | `Pauli('')` confusion
### Environment
- **Qiskit Terra version**: 0.23.2
- **Python version**: 3.9.7
- **Operating system**: Linux
### What is happening?
I find the following code example puzzling:
```python
>>> from qiskit.quantum_info import Pauli
>>> Pauli("X")[[]]
Pauli('')
>>> Pauli('')
Traceb... | 2023-03-04T01:37:29Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/garrison/serverless/.direnv/python-3.9.7/lib/python3.9/site-packages/qiskit/quantum_info/operators/symplectic/pauli.py", line 182, in __init__
base_z, base_x, base_phase = self._from_label(data)
File "/home/garrison/server... | 2,226 | ||||
Qiskit/qiskit | Qiskit__qiskit-9777 | 5ce80ab45bfdec0f300d4f2095d4fc8dfe3eaae6 | diff --git a/qiskit/circuit/quantumcircuit.py b/qiskit/circuit/quantumcircuit.py
--- a/qiskit/circuit/quantumcircuit.py
+++ b/qiskit/circuit/quantumcircuit.py
@@ -1623,7 +1623,6 @@ def qasm(
"sx",
"sxdg",
"cz",
- "ccz",
"cy",
"swap",
... | ccz roundtrip in OpenQASM 2 broken
@ryahill1 notices that roundtrip in OpenQASM2 for several gates is broken (Originally posted in https://github.com/Qiskit/qiskit-terra/issues/9559#issuecomment-1424824806)
One of those cases is [ccz](https://github.com/Qiskit/qiskit-terra/blob/d4e7144efa9c661817161f84553313bf39406f... | (Copied over): this doesn't need a separate issue. I commented the fix for #9559 at the bottom, and it's trivially the same for the three concerned gates.
The bug here is actually in the OpenQASM 2 output rather than the input - Qiskit should have emitted a file that includes a definition for `csdg`, because it's not ... | 2023-03-10T20:17:09Z | [] | [] |
Traceback (most recent call last):
...
...
File ".../lib/python3.9/site-packages/qiskit/qasm/qasmparser.py", line 138, in verify_as_gate
raise QasmError(
qiskit.qasm.exceptions.QasmError: "Cannot find gate definition for 'csdg', line 4 file line 4 file test.qasm"
| 2,230 | |||
Qiskit/qiskit | Qiskit__qiskit-9786 | 2ce129a14279a746d309f00e311b930ddbfe633c | diff --git a/qiskit/transpiler/passes/utils/gate_direction.py b/qiskit/transpiler/passes/utils/gate_direction.py
--- a/qiskit/transpiler/passes/utils/gate_direction.py
+++ b/qiskit/transpiler/passes/utils/gate_direction.py
@@ -166,6 +166,8 @@ def _run_coupling_map(self, dag, wire_map, edges=None):
cont... | Routing pass does not account for calibrations when assessing gate direction
### Environment
- **Qiskit Terra version**: 0.23.2
- **Python version**: 3.10.9
- **Operating system**: Fedora Linux 37
### What is happening?
When transpiling a circuit with a custom gate with a calibration, the gate direction pass doe... | 2023-03-13T14:52:14Z | [] | [] |
Traceback (most recent call last):
File "/reverse.py", line 16, in <module>
transpile(circ, target=target, optimization_level=1)
File "/lib/pyt... | 2,231 | ||||
Qiskit/qiskit | Qiskit__qiskit-9789 | 648da26c6a0dc6fa9710c639e9f37d96ce426ea0 | diff --git a/qiskit/compiler/transpiler.py b/qiskit/compiler/transpiler.py
--- a/qiskit/compiler/transpiler.py
+++ b/qiskit/compiler/transpiler.py
@@ -645,6 +645,11 @@ def _parse_transpile_args(
timing_constraints = target.timing_constraints()
if backend_properties is None:
backend_pr... | Gates with custom pulses fail to transpile with the new provider
### Environment
- **Qiskit Terra version**: 0.41.1
- **Python version**: 3.9.12
- **Operating system**: Linux
### What is happening?
Defining a new gate, and attaching pulses to it, triggers an error during transpilation. The same code works well w... | I didn't check the main branch.
It's known issue https://github.com/Qiskit/qiskit-terra/issues/9489. This will be fixed in next release.
Sorry this is another issue from #9489. One must fix
https://github.com/Qiskit/qiskit-terra/blob/6829bb18cf791960896fe72b9be9611aac44155a/qiskit/transpiler/passes/basis/unroll_cus... | 2023-03-13T19:46:21Z | [] | [] |
Traceback (most recent call last):
File "/mnt/c/Users/143721756/wsl/balagan/szxbug.py", line 31, in <module>
circ = transpile(circ, backend, inst_map=inst_map, basis_gates=["newgate"])
File "/home/yaelbh/miniconda3/envs/env1/lib/python3.9/site-packages/qiskit/compiler/transpiler.py", line 381, in transpile
... | 2,232 | |||
Qiskit/qiskit | Qiskit__qiskit-9792 | 44cda51974e29fc72fa7e428a14b00af48b32562 | diff --git a/qiskit/compiler/transpiler.py b/qiskit/compiler/transpiler.py
--- a/qiskit/compiler/transpiler.py
+++ b/qiskit/compiler/transpiler.py
@@ -645,6 +645,11 @@ def _parse_transpile_args(
timing_constraints = target.timing_constraints()
if backend_properties is None:
backend_pr... | Gates with custom pulses fail to transpile with the new provider
### Environment
- **Qiskit Terra version**: 0.41.1
- **Python version**: 3.9.12
- **Operating system**: Linux
### What is happening?
Defining a new gate, and attaching pulses to it, triggers an error during transpilation. The same code works well w... | I didn't check the main branch.
It's a known issue: https://github.com/Qiskit/qiskit-terra/issues/9489. This will be fixed in the next release.
Sorry, this is another issue from #9489. One must fix
https://github.com/Qiskit/qiskit-terra/blob/6829bb18cf791960896fe72b9be9611aac44155a/qiskit/transpiler/passes/basis/unroll_cus... | 2023-03-14T14:19:05Z | [] | [] |
Traceback (most recent call last):
File "/mnt/c/Users/143721756/wsl/balagan/szxbug.py", line 31, in <module>
circ = transpile(circ, backend, inst_map=inst_map, basis_gates=["newgate"])
File "/home/yaelbh/miniconda3/envs/env1/lib/python3.9/site-packages/qiskit/compiler/transpiler.py", line 381, in transpile
... | 2,234 | |||
apache/airflow | apache__airflow-1056 | b52e89203248a89d3d8f3662c5f440aeba2e025a | diff --git a/airflow/operators/bash_operator.py b/airflow/operators/bash_operator.py
--- a/airflow/operators/bash_operator.py
+++ b/airflow/operators/bash_operator.py
@@ -1,7 +1,6 @@
from builtins import bytes
import logging
-import sys
from subprocess import Popen, STDOUT, PIPE
from tempfile import gettempdir, N... | UnicodeDecodeError in bash_operator.py
Hi,
I see a lot of these errors when running `airflow backfill`:
```
Traceback (most recent call last):
File "/usr/lib/python2.7/logging/__init__.py", line 851, in emit
msg = self.format(record)
File "/usr/lib/python2.7/logging/__init__.py", line 724, in format
ret... | I encountered the same problem. I have hacked the [line#73 in airflow/operators/bash_operator.py](https://github.com/airbnb/airflow/blob/master/airflow/operators/bash_operator.py#L73) to
`line = line.decode('utf-8').strip()` to fix my problem.
error log
```
[2015-12-22 18:17:51,354] {bash_operator.py:70} INFO - Outp... | 2016-02-22T07:11:38Z | [] | [] |
Traceback (most recent call last):
File "/usr/lib/python2.7/logging/__init__.py", line 851, in emit
msg = self.format(record)
File "/usr/lib/python2.7/logging/__init__.py", line 724, in format
return fmt.format(record)
File "/usr/lib/python2.7/logging/__init__.py", line 467, in format
s = self._fmt %... | 2,257 | |||
apache/airflow | apache__airflow-11509 | 31dc6cf82734690bf95ade4554e7ebb183055311 | diff --git a/provider_packages/refactor_provider_packages.py b/provider_packages/refactor_provider_packages.py
--- a/provider_packages/refactor_provider_packages.py
+++ b/provider_packages/refactor_provider_packages.py
@@ -432,6 +432,70 @@ def amazon_package_filter(node: LN, capture: Capture, filename: Filename) -> boo... | Elasticsearch Backport Provider Incompatible with Airflow 1.10.12
**Apache Airflow version**: 1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.16.9
**Environment**:
- **Cloud provider or hardware configuration**: AWS
- **OS** (e.g. from /etc/os-release):
- **Kernel*... | This looks like a real problem. Should I assign you to this issue? Do you want to deal with it?
I can take a stab - I think we just need to backport some of the type handling from `airflow/utils/log/file_task_handler.py` on `master` to the 1.10 branch
@marcusianlevine What do you think about vendorizing the `file_task_hand... | 2020-10-13T17:06:51Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/... | 2,272 | |||
apache/airflow | apache__airflow-11723 | b946b4487086f6e1ed5e2ddf45fa258315d77a50 | diff --git a/airflow/cli/commands/task_command.py b/airflow/cli/commands/task_command.py
--- a/airflow/cli/commands/task_command.py
+++ b/airflow/cli/commands/task_command.py
@@ -171,6 +171,7 @@ def task_run(args, dag=None):
task = dag.get_task(task_id=args.task_id)
ti = TaskInstance(task, args.execution_da... | All task logging goes to the log for try_number 1
**Apache Airflow version**: 2.0.0a1
**What happened**:
When a task fails on the first try, the log output for additional tries go to the log for the first attempt.
**What you expected to happen**:
The logs should go to the correct log file. For the default con... | I can confirm that changing
https://github.com/apache/airflow/blob/172820db4d2009dd26fa8aef4a864fb8a3d7e78d/airflow/cli/commands/task_command.py#L172-L174
to
```
task = dag.get_task(task_id=args.task_id)
ti = TaskInstance(task, args.execution_date)
ti.refresh_from_db()
ti.init_run_context(r... | 2020-10-21T15:46:24Z | [] | [] |
Traceback (most recent call last):
[...]
ValueError: Shan't
| 2,279 | |||
apache/airflow | apache__airflow-11753 | f603b36aa4a07bf98ebe3b1c81676748173b8b57 | diff --git a/airflow/www/utils.py b/airflow/www/utils.py
--- a/airflow/www/utils.py
+++ b/airflow/www/utils.py
@@ -17,12 +17,14 @@
# under the License.
import json
import time
+from typing import Any, List, Optional
from urllib.parse import urlencode
import markdown
import sqlalchemy as sqla
from flask import ... | WebUI: Action on selection in task instance list yields an error
**Apache Airflow version**: v2.0.0.dev0 (latest master)
**Environment**:
- **OS**: Ubuntu 18.04.4 LTS
- **Others**: Python 3.6.9
**What happened**:
Selecting a task in the the **task instance list** (*http:localhost:8080/taskinstance/list/*) an... | Thanks for opening your first issue here! Be sure to follow the issue template!
Any ideas here @ashb @mik-laj @ryanahamilton? I was able to reproduce this.
I was able to reproduce as well. The UI refresh updates didn't touch anything related to this. I thought the recent FAB upgrade from 3.0 to 3.1 could be the culpri... | 2020-10-22T22:18:47Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_excepti... | 2,282 | |||
apache/airflow | apache__airflow-12240 | 45587a664433991b01a24bf0210116c3b562adc7 | diff --git a/airflow/api_connexion/__init__.py b/airflow/api_connexion/__init__.py
new file mode 100644
--- /dev/null
+++ b/airflow/api_connexion/__init__.py
@@ -0,0 +1,16 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with... | Airflow v2.0.0b1 package doesnt include "api_connexion/exceptions"
<!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in f... | Thanks for opening your first issue here! Be sure to follow the issue template!
cc: @ashb @kaxil -> looking at it
I downloaded the packages externally from [here](https://pypi.org/project/apache-airflow/2.0.0b1/#files); this one also doesn't contain anything except the `openapi` folder.
> I downloaded the packages ex... | 2020-11-10T10:52:45Z | [] | [] |
Traceback (most recent call last):
File "/Users/abagri/Workspace/service-workflows/venv/bin/airflow", line 8, in <module>
sys.exit(main())
File "/Users/abagri/Workspace/service-workflows/venv/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
args.func(args)
File "/Users/abagri/Works... | 2,295 | |||
apache/airflow | apache__airflow-1242 | a69df7f84b620109b03db7be6d657b3fe6f52e0d | diff --git a/airflow/hooks/postgres_hook.py b/airflow/hooks/postgres_hook.py
--- a/airflow/hooks/postgres_hook.py
+++ b/airflow/hooks/postgres_hook.py
@@ -11,7 +11,7 @@ class PostgresHook(DbApiHook):
'''
conn_name_attr = 'postgres_conn_id'
default_conn_name = 'postgres_default'
- supports_autocommit =... | GenericTransfer and Postgres - ERROR - SET AUTOCOMMIT TO OFF is no longer supported
Trying to implement a generic transfer
``` python
t1 = GenericTransfer(
task_id = 'copy_small_table',
sql = "select * from my_schema.my_table",
destination_table = "my_schema.my_table",
source_conn_id = "postgres9.1.13",
dest... | We don't run postgres at Airbnb so I can't really test a fix, but the first thing I'd try would be to change that line to `False`.
https://github.com/airbnb/airflow/blob/master/airflow/hooks/postgres_hook.py#L12
As a side note, autocommit in DbApiHook should probably be set in a different way, perhaps a `set_autocommi... | 2016-03-29T11:34:01Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 977, in run
result = task_copy.execute(context=context)
File "/usr/local/lib/python2.7/dist-packages/airflow/operators/generic_transfer.py", line 64, in execute
destination_hook.insert_rows(table=self.d... | 2,301 | |||
apache/airflow | apache__airflow-1247 | 5bda74fd9c36f524d0ee922f2183ce9795cc6562 | diff --git a/airflow/utils/db.py b/airflow/utils/db.py
--- a/airflow/utils/db.py
+++ b/airflow/utils/db.py
@@ -42,10 +42,14 @@ def provide_session(func):
@wraps(func)
def wrapper(*args, **kwargs):
needs_session = False
- if 'session' not in kwargs:
+ arg_session = 'session'
+ fun... | Scheduler Pickling - conflict with 'session' parameter defined in both args and kwargs
Dear Airflow Maintainers,
Before I tell you about my issue, let me describe my environment:
# Environment
- **Version of Airflow:** master (1db892b)
- **Example code to reproduce the bug:** Using airflow cli: `airflow scheduler -p`
... | 2016-03-29T15:37:01Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/airflow-1.6.2-py2.7.egg/airflow/jobs.py", line 653, in _execute
self.process_dag(dag, executor)
File "/usr/local/lib/python2.7/dist-packages/airflow-1.6.2-py2.7.egg/airflow/jobs.py", line 459, in process_dag
pickle_id = dag.pic... | 2,303 | ||||
apache/airflow | apache__airflow-12595 | ce919912b7ead388c0a99f4254e551ae3385ff50 | diff --git a/airflow/models/taskinstance.py b/airflow/models/taskinstance.py
--- a/airflow/models/taskinstance.py
+++ b/airflow/models/taskinstance.py
@@ -1149,7 +1149,8 @@ def _run_raw_task(
session.commit()
- self._run_mini_scheduler_on_child_tasks(session)
+ if not test_mode:
+ ... | airflow task test failing due to mini scheduler implementation not respecting test mode
**Apache Airflow version**: 2.0.0b3
**Environment**: Python3.7-slim running on docker
**What happened**:
Error when running `airflow tasks test <dag> <task> <date>` there is an error with the task rescheduler, which should ... | Thanks for opening your first issue here! Be sure to follow the issue template!
@ashb this issue is the same as https://github.com/apache/airflow/issues/12584 raised by @nathadfield, issues created 1 minute apart.
@AdilsonMendonca Sounds like you've done a better job of explaining it than I. We can close mine. | 2020-11-24T18:06:41Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/bin/airflow", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.7/site-packages/airflow/__main__.py", line 40, in main
args.func(args)
File "/usr/local/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 50, in command
... | 2,304 | |||
apache/airflow | apache__airflow-1261 | 91449a2205ce2c48596416e3207c2b6a26055a8a | diff --git a/airflow/jobs.py b/airflow/jobs.py
--- a/airflow/jobs.py
+++ b/airflow/jobs.py
@@ -401,7 +401,9 @@ def schedule_dag(self, dag):
DagRun.run_id.like(DagRun.ID_PREFIX+'%')))
last_scheduled_run = qry.scalar()
next_run_date = None
- if not last_sc... | DAG with schedule interval '@once' cannot be scheduled
Dear Airflow Maintainers,
Before I tell you about my issue, let me describe my environment:
# Environment
- Version of Airflow (e.g. a release version, running your own fork, running off master -- provide a git log snippet) : **1.7.0**
- Example code to reproduce ... | 2016-03-30T16:30:26Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 638, in _execute
self.schedule_dag(dag)
File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 397, in schedule_dag
next_run_date = dag.date_range(latest_run, -5)[0]
IndexError: list index ou... | 2,305 | ||||
apache/airflow | apache__airflow-13260 | c2bedd580c3dd0e971ac394be25e331ba9c1c932 | diff --git a/airflow/configuration.py b/airflow/configuration.py
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -199,7 +199,7 @@ def __init__(self, default_config=None, *args, **kwargs):
self.is_validated = False
- def _validate(self):
+ def validate(self):
self._validate... | Import error when using custom backend and sql_alchemy_conn_secret
**Apache Airflow version**: 2.0.0
**Environment**:
- **Cloud provider or hardware configuration**: N/A
- **OS** (e.g. from /etc/os-release): custom Docker image (`FROM python:3.6`) and macOS Big Sur (11.0.1)
- **Kernel** (e.g. `uname -a`):
... | Thanks for opening your first issue here! Be sure to follow the issue template!
It looks like you have somethng seriously wrong in your configuration -- looks like the line numbers reported do not match the line numbers from Airlfow installation. can you please remove/reinstall airflow from the scratch and see again? ... | 2020-12-22T19:05:22Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.6/site-packages/airflow/__init__.py", line 34, in <module>
from airflow import settings
File "/usr/local/lib/python3.6/site-packages/airflow/settings.py", line 35, in <module>
from airflow.configurati... | 2,323 | |||
apache/airflow | apache__airflow-13371 | 800e630d0cc9dbbf345a9cee4653861cbfda42c9 | diff --git a/airflow/upgrade/rules/airflow_macro_plugin_removed.py b/airflow/upgrade/rules/airflow_macro_plugin_removed.py
--- a/airflow/upgrade/rules/airflow_macro_plugin_removed.py
+++ b/airflow/upgrade/rules/airflow_macro_plugin_removed.py
@@ -39,9 +39,12 @@ def _check_file(self, file_path):
problems = []
... | AirflowMacroPluginRemovedRule fails on non-python files
**Apache Airflow version**: 1.10.14
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**: X
- **OS** (e.g. from /etc/os-release): X
- **Kernel** (e.g. `uname -a`)... | Please feel free to assign this to me too 🙂
Sure :). Do you want to take care of #13350 too?
I can give it a shot too! I know less about what the best approach is for that one. Also do you mind adding the `upgrade_check` label to these two? | 2020-12-29T21:12:20Z | [] | [] |
Traceback (most recent call last):
File "/Users/madison/programs/anaconda3/envs/memphis-airflow/bin/airflow", line 37, in <module>
args.func(args)
File "/Users/madison/programs/anaconda3/envs/memphis-airflow/lib/python3.8/site-packages/airflow/upgrade/checker.py", line 88, in run
all_problems = check_u... | 2,329 | |||
apache/airflow | apache__airflow-13932 | 7a5aafce08374c75562e3eb728413fefc4ab6e01 | diff --git a/airflow/jobs/scheduler_job.py b/airflow/jobs/scheduler_job.py
--- a/airflow/jobs/scheduler_job.py
+++ b/airflow/jobs/scheduler_job.py
@@ -731,7 +731,7 @@ def __init__(
self.max_tis_per_query: int = conf.getint('scheduler', 'max_tis_per_query')
self.processor_agent: Optional[DagFileProcess... | Unable to start scheduler after stopped
**Apache Airflow version**: 2.0.0rc3
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**: Linux
- **OS** (e.g. from /etc/os-release): Ubuntu
- **Kernel** (e.g. `uname -a`):
- **... | Looks serious. If we can confirm that one, I am afraid it might lead to RC4 @kaxil @ashb if this is easily triggerable.
Thanks for reporting @JCoder01 !
I've certainly never seen this, and I've done heave testing of killing and restarting schedulers.
@JCoder01 Few more questions. Maybe you can provide as much informati... | 2021-01-27T19:40:16Z | [] | [] |
Traceback (most recent call last):
File "/home/jcoder/git/airflow_2.0/pyenv/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/home/jcoder/git/airflow_2.0/pyenv/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 5... | 2,339 | |||
apache/airflow | apache__airflow-14274 | 3709503ecf180bd8c85190bcc7e5e755b60d9bfb | diff --git a/airflow/upgrade/rules/postgres_mysql_sqlite_version_upgrade_check.py b/airflow/upgrade/rules/postgres_mysql_sqlite_version_upgrade_check.py
--- a/airflow/upgrade/rules/postgres_mysql_sqlite_version_upgrade_check.py
+++ b/airflow/upgrade/rules/postgres_mysql_sqlite_version_upgrade_check.py
@@ -43,16 +43,23 ... | upgrade_check fails db version check
**Apache Airflow version**: 1.10.14 with AWS RDS mysql 5.7.26 as metastore db
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): v1.16.15
**Environment**: DEV
- **Cloud provider or hardware configuration**: AWS
- **OS** (e.g. from /etc/os-relea... | Thanks for opening your first issue here! Be sure to follow the issue template!
| 2021-02-17T13:18:14Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/bin/airflow", line 37, in <module>
args.func(args)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/upgrade/checker.py", line 118, in run
all_problems = check_upgrade(formatter, rules)
File "/home/airflow/.local/lib/python3.... | 2,343 | |||
apache/airflow | apache__airflow-14869 | 16f43605f3370f20611ba9e08b568ff8a7cd433d | diff --git a/airflow/providers/mysql/hooks/mysql.py b/airflow/providers/mysql/hooks/mysql.py
--- a/airflow/providers/mysql/hooks/mysql.py
+++ b/airflow/providers/mysql/hooks/mysql.py
@@ -18,11 +18,17 @@
"""This module allows to connect to a MySQL database."""
import json
-from typing import Dict, Optional, Tuple
+f... | MySQL hook uses wrong autocommit calls for mysql-connector-python
**Apache Airflow version**: 2.0.1
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): n/a
**Environment**:
* **Cloud provider or hardware configuration**: WSL2/Docker running `apache/airflow:2.0.1-python3.7` image
* *... | Thanks for opening your first issue here! Be sure to follow the issue template!
You can know which is used by using `client_name`:
https://github.com/apache/airflow/blob/2a2adb3f94cc165014d746102e12f9620f271391/airflow/providers/mysql/hooks/mysql.py#L140
https://github.com/apache/airflow/blob/2a2adb3f94cc165014d... | 2021-03-18T05:58:43Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1112, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 128... | 2,348 | |||
apache/airflow | apache__airflow-15074 | b4374d33b0e5d62c3510f1f5ac4a48e7f48cb203 | diff --git a/airflow/www/utils.py b/airflow/www/utils.py
--- a/airflow/www/utils.py
+++ b/airflow/www/utils.py
@@ -126,9 +126,11 @@ def generate_pages(current_page, num_of_pages, search=None, status=None, window=
output = [Markup('<ul class="pagination" style="margin-top:0;">')]
is_disabled = 'disabled' if ... | WEB UI, last page button does not work when all dags are in not active state
**Apache Airflow version**:
1.10.10
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
none
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
NAME="Cent... | Hello.
Thanks for reporting a bug. This looks like something worth working on.
Would you like to work on a fix for this bug? We are open to contributions from everyone.
Best regards,
Kamil Breguła
ok, I'll try to write patches for the bugs I find. I'm just back from holidays; I just need a bit of time to organise my work.
Traceback (most recent call last):
File "/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib/python2.7/site-packages/flask/app.py", line 2446, in wsgi_app
response = self.full_dispatch_request()
File "/jhub/_prod/server_global_unifieddata_hadoop_airflow_daemon/lib/python2.7/site-packages/flask/a... | 2,350 | |||
apache/airflow | apache__airflow-15132 | a6070026576d6a266c8df380a57deea3b43772d5 | diff --git a/airflow/stats.py b/airflow/stats.py
--- a/airflow/stats.py
+++ b/airflow/stats.py
@@ -343,7 +343,7 @@ def timer(self, stat=None, *args, tags=None, **kwargs):
"""Timer metric that can be cancelled"""
if stat and self.allow_list_validator.test(stat):
tags = tags or []
- ... | Enabling Datadog to tag metrics results in AttributeError
**Apache Airflow version**: 2.0.1
**Python version**: 3.8
**Cloud provider or hardware configuration**: AWS
**What happened**:
In order to add tags to [Airflow metrics,](https://airflow.apache.org/docs/apache-airflow/stable/logging-monitoring/metrics.html)... | Thanks for opening your first issue here! Be sure to follow the issue template!
| 2021-04-01T11:49:50Z | [] | [] |
Traceback (most recent call last):
```
**What you expected to happen**:
The same default Airflow metrics get sent by connecting to datadog, tagged with the metrics specified in `AIRFLOW__METRICS__STATSD_DATADOG_TAGS`.
| 2,356 | |||
apache/airflow | apache__airflow-15212 | 18066703832319968ee3d6122907746fdfda5d4c | diff --git a/airflow/cli/commands/info_command.py b/airflow/cli/commands/info_command.py
--- a/airflow/cli/commands/info_command.py
+++ b/airflow/cli/commands/info_command.py
@@ -15,7 +15,6 @@
# specific language governing permissions and limitations
# under the License.
"""Config sub-commands"""
-import getpass
im... | Don't crash when a getpass.getuser() call fails
**Apache Airflow version**: 1.10.11
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.18.4
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Ubuntu 20.04
- **Kernel** (e.g. `un... | 2021-04-05T19:42:00Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/bin/airflow", line 37, in <module>
args.func(args)
File "/usr/local/lib/python3.8/site-packages/airflow/utils/cli.py", line 76, in wrapper
return f(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/bin/cli.py", line 1189, in sch... | 2,359 | ||||
apache/airflow | apache__airflow-15680 | 30eeac7b7ed4ab5ea191691a3b713e3d66c0baff | diff --git a/airflow/providers/amazon/aws/transfers/mongo_to_s3.py b/airflow/providers/amazon/aws/transfers/mongo_to_s3.py
--- a/airflow/providers/amazon/aws/transfers/mongo_to_s3.py
+++ b/airflow/providers/amazon/aws/transfers/mongo_to_s3.py
@@ -40,7 +40,7 @@ class MongoToS3Operator(BaseOperator):
:param mongo_co... | MongoToS3Operator failed when running with a single query (not aggregate pipeline)
**Apache Airflow version**: 2.0.2
**What happened**:
`MongoToS3Operator` failed when running with a single query (not aggregate pipeline):
```sh
Traceback (most recent call last):
File "/home/airflow//bin/airflow", line 8, i... | 2021-05-05T17:14:15Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow//bin/airflow", line 8, in <module>
sys.exit(main())
File "/home/airflow//lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
args.func(args)
File "/home/airflow//lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in ... | 2,369 | ||||
apache/airflow | apache__airflow-15794 | b272f9cec99fd0e3373d23b706f33892cbcb9626 | diff --git a/airflow/providers/apache/spark/hooks/spark_sql.py b/airflow/providers/apache/spark/hooks/spark_sql.py
--- a/airflow/providers/apache/spark/hooks/spark_sql.py
+++ b/airflow/providers/apache/spark/hooks/spark_sql.py
@@ -17,11 +17,14 @@
# under the License.
#
import subprocess
-from typing import Any, List... | Spark SQL Hook not using connections
**Apache Airflow version**: 1.10.10
**What happened**:
`SparkSqlHook` is not using any connection, the default conn_id is `spark_sql_default`, if this connection doesn't exist, the hook returns an error:
```
Traceback (most recent call last):
File "/Users/rbottega/Documen... | Thanks for opening your first issue here! Be sure to follow the issue template!
Can I take this issue?
Hi @danielenricocahall
Sure, go ahead.
I've assigned it to you @danielenricocahall
@danielenricocahall I'm unassigning you as you didn't complete the PR; if you wish to finish it, let us know.
This issue is open t... | 2021-05-12T09:12:38Z | [] | [] |
Traceback (most recent call last):
File "/Users/rbottega/Documents/airflow_latest/env/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 983, in _run_raw_task
result = task_copy.execute(context=context)
File "/Users/rbottega/Documents/airflow_latest/env/lib/python3.7/site-packages/airflow/con... | 2,371 | |||
apache/airflow | apache__airflow-15822 | 1edef28b315809c8367b29f3f1a83984dc6566c4 | diff --git a/airflow/models/dag.py b/airflow/models/dag.py
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -1463,13 +1463,8 @@ def partial_subset(
"""
# deep-copying self.task_dict and self._task_group takes a long time, and we don't want all
# the tasks anyway, so we copy the task... | Task preceeding PythonVirtualenvOperator fails: "cannot pickle 'module' object"
**Apache Airflow version**
13faa6912f7cd927737a1dc15630d3bbaf2f5d4d
**Environment**
- **Configuration**: Local Executor
- **OS** (e.g. from /etc/os-release): Mac OS 11.3
- **Kernel**: Darwin Kernel Version 20.4.0
- **Install too... | I hate pickle :/
Turns out it's nothing really to do with pickle, just to do with trying to copy _any_ module object.
And the horrifying thing? This has been broken since 2.0.0. | 2021-05-13T13:50:32Z | [] | [] |
Traceback (most recent call last):
File "/Users/matt/src/airflow/airflow/executors/debug_executor.py", line 79, in _run_task
ti._run_raw_task(job_id=ti.job_id, **params) # pylint: disable=protected-access
File "/Users/matt/src/airflow/airflow/utils/session.py", line 70, in wrapper
return func(*args, s... | 2,373 | |||
apache/airflow | apache__airflow-16108 | aeb93f8e5bb4a9175e8834d476a6b679beff4a7e | diff --git a/airflow/cli/commands/task_command.py b/airflow/cli/commands/task_command.py
--- a/airflow/cli/commands/task_command.py
+++ b/airflow/cli/commands/task_command.py
@@ -88,6 +88,7 @@ def _run_task_by_executor(args, dag, ti):
print(e)
raise e
executor = ExecutorLoader.get_default... | Could not get scheduler_job_id
**Apache Airflow version:**
2.0.0
**Kubernetes version (if you are using kubernetes) (use kubectl version):**
1.18.3
**Environment:**
Cloud provider or hardware configuration: AWS
**What happened:**
When trying to run a DAG, it gets scheduled, but task is never run. W... | Thanks for opening your first issue here! Be sure to follow the issue template!
Does it only happen with Kubernetes Executor?
Looks like a bug with Kubernetes Executor. Related issue: https://github.com/apache/airflow/issues/13805
> Does it only happen with Kubernetes Executor?
Yes, it happens only with Kubernete... | 2021-05-27T08:00:16Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/... | 2,382 | |||
apache/airflow | apache__airflow-16118 | 6736290ca3ca31223717825be0ae3625cf7d214c | diff --git a/airflow/utils/log/secrets_masker.py b/airflow/utils/log/secrets_masker.py
--- a/airflow/utils/log/secrets_masker.py
+++ b/airflow/utils/log/secrets_masker.py
@@ -16,6 +16,7 @@
# under the License.
"""Mask sensitive information from logs"""
import collections
+import io
import logging
import re
from t... | Secret masking fails on io objects
**Apache Airflow version**: 2.1.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A
**Environment**: *NIX
- **Cloud provider or hardware configuration**: N/A
- **OS** (e.g. from /etc/os-release): N/A
- **Kernel** (e.g. `uname -a`): N/A
- *... | Cc: @ashb | 2021-05-27T14:54:02Z | [] | [] |
Traceback (most recent call last):
File "/Users/madison/programs/anaconda3/envs/ookla-airflow/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1137, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/Users/madison/programs/anaconda3/envs/ookla-airflow/lib/p... | 2,383 | |||
apache/airflow | apache__airflow-16345 | ce28bc52a8d477e137f9293ce0f3f90d4e291883 | diff --git a/airflow/models/serialized_dag.py b/airflow/models/serialized_dag.py
--- a/airflow/models/serialized_dag.py
+++ b/airflow/models/serialized_dag.py
@@ -280,7 +280,7 @@ def get_last_updated_datetime(cls, dag_id: str, session: Session = None) -> Opti
@classmethod
@provide_session
- def get_max_l... | error on click in dag-dependencies - airflow 2.1
Python version: 3.7.9
Airflow version: 2.1.0
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.7/site-packages/flask/... | Thanks for opening your first issue here! Be sure to follow the issue template!
@jcmartins thanks for submitting. Do you want to submit a fix as well?
| 2021-06-09T03:45:18Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/... | 2,385 | |||
apache/airflow | apache__airflow-16383 | e72e5295fd5e710599bc0ecc9a70b0b3b5728f38 | diff --git a/airflow/utils/json.py b/airflow/utils/json.py
--- a/airflow/utils/json.py
+++ b/airflow/utils/json.py
@@ -17,6 +17,7 @@
# under the License.
from datetime import date, datetime
+from decimal import Decimal
import numpy as np
from flask.json import JSONEncoder
@@ -37,12 +38,19 @@ def __init__(self, ... | Airflow Stable REST API [GET api/v1/pools] issue
**Apache Airflow version**: v2.0.2
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A
**Environment**: AWS
- **Cloud provider or hardware configuration**: AWS EC2 Instance
- **OS** (e.g. from /etc/os-release): Ubuntu Server 20.04 ... | Thanks for opening your first issue here! Be sure to follow the issue template!
| 2021-06-11T10:50:45Z | [] | [] |
Traceback (most recent call last):
File "/home/tool/gto_env/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/home/tool/gto_env/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e... | 2,387 | |||
apache/airflow | apache__airflow-16393 | ce28bc52a8d477e137f9293ce0f3f90d4e291883 | diff --git a/airflow/models/serialized_dag.py b/airflow/models/serialized_dag.py
--- a/airflow/models/serialized_dag.py
+++ b/airflow/models/serialized_dag.py
@@ -313,18 +313,12 @@ def get_dag_dependencies(cls, session: Session = None) -> Dict[str, List['DagDep
:param session: ORM Session
:type sessio... | exception when root account goes to http://airflow.ordercapital.com/dag-dependencies
Happens every time
Python version: 3.8.10
Airflow version: 2.1.0
Node: airflow-web-55974db849-5bdxq
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "/... | Thanks for opening your first issue here! Be sure to follow the issue template!
I'm guessing this might be related to #16328. Different traceback, but might be the same root cause?
Actually no. The error message suggests `row[1]` is `None`, which is a value returned by Postgres’s `json_extract_path`:
https://github... | 2021-06-11T14:33:52Z | [] | [] |
Traceback (most recent call last):
File "/opt/bitnami/airflow/venv/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/opt/bitnami/airflow/venv/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_us... | 2,390 | |||
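The crash above happens because Postgres's `json_extract_path` returns `NULL` for DAGs without stored dependencies, and that `None` is fed straight into the JSON deserializer. A hedged sketch of the defensive handling, assuming rows of `(dag_id, raw_json)` pairs as the query would return them (the helper name is hypothetical):

```python
import json


def build_dag_dependencies(rows):
    """Map dag_id -> dependency list, tolerating NULL json_extract_path results."""
    return {dag_id: json.loads(raw) if raw is not None else [] for dag_id, raw in rows}


rows = [
    ("etl_dag", '[{"source": "upstream", "target": "etl_dag"}]'),
    ("standalone_dag", None),  # json_extract_path returned NULL for this row
]
print(build_dag_dependencies(rows)["standalone_dag"])  # []
```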
apache/airflow | apache__airflow-16415 | 0c80a7d41100bf8d18b661c8286d6056e6d5d2f1 | diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py
--- a/airflow/models/baseoperator.py
+++ b/airflow/models/baseoperator.py
@@ -559,6 +559,14 @@ def __init__(
if wait_for_downstream:
self.depends_on_past = True
+ if retries is not None and not isinstance(retries... | Unable to clear Failed task with retries
**Apache Airflow version**: 2.0.1
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): NA
**Environment**: Windows WSL2 (Ubuntu) Local
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Ubuntu 18.04
- **K... | I have the same problem with Airflow 2.0.1. I made an update from version 1.10.14 and the clear function is not working anymore. Can we try to fix this issue? I think it's a quite important function.
we also have the same issues on Airflow 1.10.10, Python 3.6. This is happening only on few dags the clear function is no... | 2021-06-12T15:21:00Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/home/airflow/.local/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_excepti... | 2,392 | |||
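The patch above adds a parse-time guard in `BaseOperator.__init__` so a non-int `retries` (e.g. the string `"3"` sourced from config or templating) is coerced instead of breaking comparisons when the task is cleared. A rough standalone sketch of such a guard — the function name and messages are hypothetical:

```python
import logging

log = logging.getLogger(__name__)


def parse_retries(retries, task_id="<task>"):
    """Coerce retries to int at parse time; None is allowed to pass through."""
    if retries is None or isinstance(retries, int):
        return retries
    try:
        parsed = int(retries)
    except (TypeError, ValueError):
        raise TypeError(
            f"'retries' type must be int, not {type(retries).__name__} for task {task_id!r}"
        )
    log.warning("Implicitly converting 'retries' from %r to int for task %r", retries, task_id)
    return parsed


print(parse_retries("3"))  # 3
```

Failing loudly at DAG parse time is preferable to the original behavior, where the bad value only exploded later inside the webserver's clear view.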
apache/airflow | apache__airflow-16491 | e72e5295fd5e710599bc0ecc9a70b0b3b5728f38 | diff --git a/airflow/utils/log/secrets_masker.py b/airflow/utils/log/secrets_masker.py
--- a/airflow/utils/log/secrets_masker.py
+++ b/airflow/utils/log/secrets_masker.py
@@ -111,6 +111,7 @@ class SecretsMasker(logging.Filter):
patterns: Set[str]
ALREADY_FILTERED_FLAG = "__SecretsMasker_filtered"
+ MAX_R... | secrets_masker RecursionError with nested TriggerDagRunOperators
**Apache Airflow version**: 2.1.0
**Environment**: tested on Windows docker-compose environment and on k8s (both with celery executor).
**What happened**:
```
[2021-06-16 07:56:32,682] {taskinstance.py:1481} ERROR - Task failed with exception
... | Thanks for opening your first issue here! Be sure to follow the issue template!
We should implement some kind of cycle detection in the redaction logic.
Maybe a simple max depth of recursion. Somewhat arbitrary, but trying to solve it 'properly' might be firstly unnecessary and secondly quite a bit too costly for the l... | 2021-06-16T22:02:07Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1137, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 131... | 2,394 | |||
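The diff above caps recursion in `SecretsMasker` with a `MAX_R...` constant, following the "simple max depth of recursion" suggestion in the comments. A simplified, self-contained sketch of depth-limited redaction — the depth value and structure are assumptions, not the provider's actual code:

```python
MAX_RECURSION_DEPTH = 5  # arbitrary cap, per the "simple max depth" suggestion


def redact(item, depth=0):
    """Replace string leaves with '***', bailing out at a fixed depth so
    self-referential structures cannot trigger a RecursionError."""
    if depth > MAX_RECURSION_DEPTH:
        return item  # give up redacting rather than recurse forever
    if isinstance(item, str):
        return "***"
    if isinstance(item, dict):
        return {k: redact(v, depth + 1) for k, v in item.items()}
    if isinstance(item, (list, tuple)):
        return type(item)(redact(v, depth + 1) for v in item)
    return item


conf = {"conf": {"password": "hunter2"}}
conf["conf"]["parent"] = conf  # cycle, as with nested TriggerDagRunOperators
print(redact(conf)["conf"]["password"])  # *** (and no RecursionError)
```

The trade-off is visible in the sketch: below the cutoff depth, values pass through unredacted, which is why the cap is a pragmatic rather than a "proper" cycle-detection fix.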
apache/airflow | apache__airflow-17105 | 3a2e162387b5d73f1badda8fcf027fbc2caa0f28 | diff --git a/airflow/models/dag.py b/airflow/models/dag.py
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -75,6 +75,7 @@
from airflow.timetables.simple import NullTimetable, OnceTimetable
from airflow.typing_compat import Literal, RePatternType
from airflow.utils import timezone
+from airflow.utils.dag_c... | dag.cli should detect DAG cycles
**Description**
I wish `dag.cli()` reported cycles in a task graph.
**Use case / motivation**
We use Airflow (now 2.1.1), with about 40 DAGs authored by many people, with daily changes, and put our DAGs into custom docker image that we deploy with flux.
However, I noticed ... | I think this begs a more general question, should Airflow actively check whether a DAG is actually a DAG (directed acyclic graph)? Currently (from what I know) we don’t actually check this, and the cyclic error happens only when the DAG is actually run.
That's a fair point (regarding what Uranus pointed out), but it do... | 2021-07-20T09:27:11Z | [] | [] |
Traceback (most recent call last):
File "/home/witek/code/airflow/dags/primary/examples/tutorial_cycles.py", line 33, in <module>
print(dag.topological_sort())
File "/home/witek/airflow-testing/venv/lib/python3.9/site-packages/airflow/models/dag.py", line 1119, in topological_sort
raise AirflowExceptio... | 2,403 | |||
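Parse-time cycle detection, as requested above, reduces to a topological check over the task graph. A self-contained sketch using Kahn's algorithm over a hypothetical adjacency dict of `task_id -> downstream task ids` (not Airflow's actual `check_cycle` implementation):

```python
from collections import deque


def has_cycle(downstream):
    """True if the task graph given as {task_id: [downstream ids]} has a cycle."""
    indegree = {t: 0 for t in downstream}
    for targets in downstream.values():
        for t in targets:
            indegree[t] = indegree.get(t, 0) + 1
    queue = deque(t for t, d in indegree.items() if d == 0)
    visited = 0
    while queue:
        node = queue.popleft()
        visited += 1
        for t in downstream.get(node, ()):
            indegree[t] -= 1
            if indegree[t] == 0:
                queue.append(t)
    return visited < len(indegree)  # unvisited nodes sit on a cycle


print(has_cycle({"t1": ["t2"], "t2": ["t3"], "t3": ["t1"]}))  # True
```

Running such a check from `dag.cli()` (or any parse path) reports the cycle immediately instead of deferring the `AirflowException` to scheduling time.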
apache/airflow | apache__airflow-17210 | 87f408b1e78968580c760acb275ae5bb042161db | diff --git a/airflow/providers/amazon/aws/hooks/base_aws.py b/airflow/providers/amazon/aws/hooks/base_aws.py
--- a/airflow/providers/amazon/aws/hooks/base_aws.py
+++ b/airflow/providers/amazon/aws/hooks/base_aws.py
@@ -37,6 +37,7 @@
import tenacity
from botocore.config import Config
from botocore.credentials import ... | AWS Hooks fail when assuming role and connection id contains forward slashes
**Apache Airflow version**: 2.1.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): Client Version: version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.3", GitCommit:"1e11e4a2108024935ecfcb2912226cedeafd99df"... | Thanks for opening your first issue here! Be sure to follow the issue template!
Are you willing to submit a PR? I'm happy to help with review.
> Are you willing to submit a PR? I'm happy to help with review.
That may be something I can start on next week.
I'd appreciate input on solutions.
My initial thought... | 2021-07-25T17:35:57Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1137, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1311, in _prepare_and_e... | 2,404 | |||
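The root cause is that the connection id (which may contain `/`) ends up as the STS `RoleSessionName`, which only permits characters matching `[\w+=,.@-]`. A sketch of sanitizing the name before the `assume_role` call — the function name is hypothetical; the character class comes from the STS constraint:

```python
import re


def strip_invalid_session_characters(role_session_name):
    """Replace anything STS's RoleSessionName rejects (pattern [\\w+=,.@-])."""
    return re.sub(r"[^\w+=,.@-]", "_", role_session_name)


print(strip_invalid_session_characters("path/to/conn-id"))  # path_to_conn-id
```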
apache/airflow | apache__airflow-17539 | 719709b6e994a99ad2cb8f90042a19a7924acb8e | diff --git a/airflow/providers/google/cloud/secrets/secret_manager.py b/airflow/providers/google/cloud/secrets/secret_manager.py
--- a/airflow/providers/google/cloud/secrets/secret_manager.py
+++ b/airflow/providers/google/cloud/secrets/secret_manager.py
@@ -19,11 +19,6 @@
import logging
from typing import Optional
... | google.api_core.exceptions.Unknown: None Stream removed (Snowflake and GCP Secret Manager)
**Apache Airflow version**: 2.1.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A
**Environment**:
- **Cloud provider or hardware configuration**: Astronomer-based local setup using Do... | Thanks for opening your first issue here! Be sure to follow the issue template!
I am afraid there isn't much we can do in Airflow to fix it. Closing it. | 2021-08-10T17:47:49Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1137, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1311, in _prepare_and_e... | 2,408 | |||
apache/airflow | apache__airflow-18224 | 27144bd36794d3450a337786c84c4ddde9c79da3 | diff --git a/airflow/api_connexion/endpoints/user_endpoint.py b/airflow/api_connexion/endpoints/user_endpoint.py
--- a/airflow/api_connexion/endpoints/user_endpoint.py
+++ b/airflow/api_connexion/endpoints/user_endpoint.py
@@ -21,7 +21,7 @@
from werkzeug.security import generate_password_hash
from airflow.api_conne... | POST /api/v1/users fails with exception
### Apache Airflow version
main (development)
### Operating System
From Astronomer’s QA team
### Versions of Apache Airflow Providers
_No response_
### Deployment
Astronomer
### Deployment details
_No response_
### What happened
When adding a new user, The following ex... | On further investigating, the problem seems to be the email field, F.A.B. seems to have a UNIQUE key on it. I’ll do a PR. | 2021-09-14T04:46:08Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/... | 2,424 | |||
apache/airflow | apache__airflow-18602 | 324aca410bbbddb22336746338ea8a60ad0b3989 | diff --git a/airflow/www/widgets.py b/airflow/www/widgets.py
--- a/airflow/www/widgets.py
+++ b/airflow/www/widgets.py
@@ -42,10 +42,12 @@ def __call__(self, field, **kwargs):
kwargs.setdefault("id", field.id)
kwargs.setdefault("name", field.name)
if not field.data:
- field.data = ... | Error when querying on the Browse view with empty date picker
**Apache Airflow version**: 2.0.2
**What happened**:
Under Browse, when querying with any empty datetime fields, I received the mushroom cloud.
```
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/flask/app.py", li... | I wonder what the best solution is. We can ignore empty fields in the backend (and probably should no matter what), but I would want the frontend to block form submission when this happens since leaving a field empty is likely a user error.
These pages and forms are created in Flask so we'd need to disable it there if ... | 2021-09-29T08:08:31Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/... | 2,429 | |||
apache/airflow | apache__airflow-18733 | 181ac36db3749050a60fc1f08ceace005c5cb58b | diff --git a/airflow/providers/amazon/aws/operators/ecs.py b/airflow/providers/amazon/aws/operators/ecs.py
--- a/airflow/providers/amazon/aws/operators/ecs.py
+++ b/airflow/providers/amazon/aws/operators/ecs.py
@@ -453,6 +453,10 @@ def _check_success_task(self) -> None:
raise AirflowException(response)
... | When an ECS Task fails to start, ECS Operator raises a CloudWatch exception
<!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put ... | Thanks for opening your first issue here! Be sure to follow the issue template!
Yes. And Fargate tasks are more likely to experience this `failed-to-start` issue than the conventional EC2 tasks. I've been back and forth with AWS support on this one for a long long time. AWS's diagnosis is this:
* Randomly AWS ... | 2021-10-05T09:59:20Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
result = task_copy.execute(context=context)
File "/usr/local/lib/python3.7/site-packages/airflow/contrib/operators/ecs_operator.py", line 152, in execute
self._che... | 2,430 | |||
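The fix discussed above is to surface ECS's own `stoppedReason` when a task never started, instead of failing later while fetching CloudWatch logs that do not exist. A dict-based sketch of that check, assuming the `describe_tasks` response shape (`stopCode`, `stoppedReason` keys per the ECS API):

```python
def raise_if_failed_to_start(task):
    """Surface ECS's stoppedReason for tasks that never started."""
    if task.get("stopCode") == "TaskFailedToStart":
        reason = task.get("stoppedReason", "unknown")
        raise RuntimeError(f"The task failed to start: {reason}")


failed = {
    "stopCode": "TaskFailedToStart",
    "stoppedReason": "Timeout waiting for network interface provisioning",
    "containers": [{}],
}
try:
    raise_if_failed_to_start(failed)
except RuntimeError as err:
    print(err)  # the real reason, not a confusing CloudWatch log-fetch error
```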
apache/airflow | apache__airflow-19258 | 61d009305478e76e53aaf43ce07a181ebbd259d3 | diff --git a/airflow/www/views.py b/airflow/www/views.py
--- a/airflow/www/views.py
+++ b/airflow/www/views.py
@@ -2863,6 +2863,7 @@ def gantt(self, session=None):
task_dict['end_date'] = task_dict['end_date'] or timezone.utcnow()
task_dict['extraLinks'] = dag.get_task(ti.task_id).extra_links
... | Task modal links are broken in the dag gantt view
### Apache Airflow version
2.2.0 (latest released)
### Operating System
Debian GNU/Linux 11 (bullseye)
### Versions of Apache Airflow Providers
N/A
### Deployment
Other Docker-based deployment
### Deployment details
CeleryExecutor / ECS / Post... | 2021-10-27T14:33:56Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/... | 2,445 | ||||
apache/airflow | apache__airflow-19307 | 98d906743689c4e0068db7a8b0d10f2486638a3b | diff --git a/airflow/models/dag.py b/airflow/models/dag.py
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -607,12 +607,12 @@ def previous_schedule(self, dttm):
return None
return self.timetable._get_prev(timezone.coerce_datetime(dttm))
- def get_next_data_interval(self, dag_model:... | "Not a valid timetable" when returning None from next_dagrun_info in a custom timetable
### Apache Airflow version
2.2.0 (latest released)
### Operating System
Mac
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### What ... | 2021-10-29T10:18:39Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/scheduler_job.py", line 623, in _execute
self._run_scheduler_loop()
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/jobs/scheduler_job.py", line 704, in _run_scheduler_loop
num_queued_t... | 2,446 | ||||
apache/airflow | apache__airflow-19418 | 5a113f302769f0ecad3a54bad3027d459cb276a4 | diff --git a/airflow/timetables/interval.py b/airflow/timetables/interval.py
--- a/airflow/timetables/interval.py
+++ b/airflow/timetables/interval.py
@@ -271,8 +271,9 @@ def serialize(self) -> Dict[str, Any]:
return {"delta": delta}
def validate(self) -> None:
- if self._delta.total_seconds() <=... | A dag's schedule interval can no longer be an instance of dateutils.relativedelta
### Apache Airflow version
2.2.1 (latest released)
### Operating System
debian
### Versions of Apache Airflow Providers
apache-airflow==2.2.1
apache-airflow-providers-amazon==2.3.0
apache-airflow-providers-ftp==2.0.1
apache-airflo... | 2021-11-05T02:37:03Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/airflow/models/dagbag.py", line 515, in collect_dags
found_dags = self.process_file(filepath, only_if_updated=only_if_updated, safe_mode=safe_mode)
File "/usr/local/lib/python3.9/site-packages/airflow/models/dagbag.py", line 298... | 2,448 | ||||
apache/airflow | apache__airflow-19481 | 360474fff3738f70e95580a91e778250afa7ce82 | diff --git a/airflow/api/common/experimental/trigger_dag.py b/airflow/api/common/experimental/trigger_dag.py
--- a/airflow/api/common/experimental/trigger_dag.py
+++ b/airflow/api/common/experimental/trigger_dag.py
@@ -68,10 +68,12 @@ def _trigger_dag(
)
run_id = run_id or DagRun.generate_run_id(Dag... | A dag's schedule interval can no longer be an instance of dateutils.relativedelta
### Apache Airflow version
2.2.1 (latest released)
### Operating System
debian
### Versions of Apache Airflow Providers
apache-airflow==2.2.1
apache-airflow-providers-amazon==2.3.0
apache-airflow-providers-ftp==2.0.1
apache-airflo... | 2021-11-08T20:22:48Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/airflow/models/dagbag.py", line 515, in collect_dags
found_dags = self.process_file(filepath, only_if_updated=only_if_updated, safe_mode=safe_mode)
File "/usr/local/lib/python3.9/site-packages/airflow/models/dagbag.py", line 298... | 2,451 | ||||
apache/airflow | apache__airflow-19668 | 9a246d3fa30439fb2240458dbb220c24214b4831 | diff --git a/airflow/providers/microsoft/azure/operators/container_instances.py b/airflow/providers/microsoft/azure/operators/container_instances.py
--- a/airflow/providers/microsoft/azure/operators/container_instances.py
+++ b/airflow/providers/microsoft/azure/operators/container_instances.py
@@ -199,7 +199,7 @@ def e... | AzureContainerInstancesOperator is not working due to argument error
### Apache Airflow Provider(s)
microsoft-azure
### Versions of Apache Airflow Providers
3.1.0
### Apache Airflow version
2.1.3 (latest released)
### Operating System
Ubuntu
### Deployment
Docker-Compose
### Deployment details
_No response_
... | Thanks for opening your first issue here! Be sure to follow the issue template!
@binhnefits assigned to you
@binhnefits are you still working on this issue?
I can also help with this one.
We are facing the same issue.
Would be great if this could be fixed.
| 2021-11-18T03:12:42Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1164, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 128... | 2,454 | |||
apache/airflow | apache__airflow-19821 | 314a4fe0050783ebb43b300c4c950667d1ddaa89 | diff --git a/airflow/sensors/base.py b/airflow/sensors/base.py
--- a/airflow/sensors/base.py
+++ b/airflow/sensors/base.py
@@ -17,12 +17,14 @@
# under the License.
import datetime
+import functools
import hashlib
import os
import time
from datetime import timedelta
from typing import Any, Callable, Dict, Itera... | Airflow scheduler crashed with TypeError: '>=' not supported between instances of 'datetime.datetime' and 'NoneType'
### Apache Airflow version
2.1.4
### Operating System
Ubuntu 20.04.3 LTS
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### De... | Thanks for opening your first issue here! Be sure to follow the issue template!
As per https://dev.mysql.com/doc/refman/5.7/en/datetime.html,
> The TIMESTAMP data type is used for values that contain both date and time parts. TIMESTAMP has a range of '1970-01-01 00:00:01' UTC to '2038-01-19 03:14:07' UTC.
> Invalid... | 2021-11-25T09:07:07Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/airflow/jobs/scheduler_job.py", line 695, in _execute
self._run_scheduler_loop()
File "/usr/local/lib/python3.8/dist-packages/airflow/jobs/scheduler_job.py", line 788, in _run_scheduler_loop
num_queued_tis = self._do_schedu... | 2,458 | |||
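Following the MySQL hint above — `TIMESTAMP` only covers 1970-01-01 00:00:01 to 2038-01-19 03:14:07 UTC, and out-of-range values are stored as zero and read back as `None` — one mitigation is to clamp any computed reschedule date into that range before persisting it. A hedged sketch, not the actual sensor patch:

```python
from datetime import datetime, timedelta, timezone

# MySQL TIMESTAMP columns only cover 1970-01-01 00:00:01 .. 2038-01-19 03:14:07
# UTC; out-of-range values are stored as zero and later read back as None.
MYSQL_TIMESTAMP_MAX = datetime(2038, 1, 19, 3, 14, 7, tzinfo=timezone.utc)


def next_reschedule_date(now, poke_interval_seconds):
    """Next sensor reschedule date, clamped to what MySQL can store."""
    proposed = now + timedelta(seconds=poke_interval_seconds)
    return min(proposed, MYSQL_TIMESTAMP_MAX)


far_future = next_reschedule_date(datetime(2037, 12, 31, tzinfo=timezone.utc), 10**8)
print(far_future)  # clamped to 2038-01-19 03:14:07+00:00
```

Clamping keeps the stored value valid, so the scheduler never reads back a `None` to compare against a real datetime.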
apache/airflow | apache__airflow-19933 | 5b50d610d4f1288347392fac4a6eaaed78d1bc41 | diff --git a/airflow/task/task_runner/standard_task_runner.py b/airflow/task/task_runner/standard_task_runner.py
--- a/airflow/task/task_runner/standard_task_runner.py
+++ b/airflow/task/task_runner/standard_task_runner.py
@@ -85,12 +85,12 @@ def _start_by_fork(self):
args.func(args, dag=self.dag)
... | Reference to undeclared variable: "local variable 'return_code' referenced before assignment"
### Apache Airflow version
2.2.1
### Operating System
Ubuntu 20.04 LTS
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon==2.3.0
apache-airflow-providers-apache-cassandra==2.1.0
apache-ai... | @a-pertsev maybe you woudl like to make a PR fixing it ?
@potiuk unfortunately i had some problems with using breeze to run tests (
I will make one more try later, maybe
From the traceback it looks like this could be solved just by moving `return_code = 1` before `self.log.exception(...)` (?)
yeap
@a-pertsev I could... | 2021-12-01T14:17:46Z | [] | [] |
Traceback (most recent call last):
File "/var/lib/airflow/.venv/lib/python3.8/site-packages/airflow/executors/celery_executor.py", line 121, in _execute_in_fork
args.func(args)
File "/var/lib/airflow/.venv/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **... | 2,463 | |||
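The one-line fix suggested in the comments — bind `return_code` before any statement that can raise — is a general pattern: a variable read after a `try`/`except` must exist on every path. A minimal runnable sketch:

```python
def run_task(func):
    """Run a callable, always producing a return code even when it fails."""
    return_code = 1  # bind *before* the try: assume failure until success
    try:
        func()
        return_code = 0
    except Exception:
        # Had return_code been assigned only inside the try, reading it
        # here or afterwards would raise UnboundLocalError.
        pass
    return return_code


print(run_task(lambda: None), run_task(lambda: 1 / 0))  # 0 1
```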
apache/airflow | apache__airflow-20737 | ca4d2a3d088c911861748f281d3009f9b2167591 | diff --git a/airflow/cli/commands/task_command.py b/airflow/cli/commands/task_command.py
--- a/airflow/cli/commands/task_command.py
+++ b/airflow/cli/commands/task_command.py
@@ -16,24 +16,27 @@
# specific language governing permissions and limitations
# under the License.
"""Task sub-commands"""
+import datetime
i... | DagRun for <FOO> with run_id or execution_date of 'manual__XXXXX' not found
### Apache Airflow version
2.2.2 (latest released)
### What happened
After upgrading from Airflow 2.1.4 to 2.2.2, every DAG gives this error upon execution:
> [2021-12-17, 15:01:12 UTC] {taskinstance.py:1259} INFO - Executing <Task(_Pytho... | cc @uranusjr Can you take a look when you have time, please? | 2022-01-07T04:43:23Z | [] | [] |
Traceback (most recent call last):
File "/home/ec2-user/venv/lib/python3.7/site-packages/airflow/task/task_runner/standard_task_runner.py", line 85, in _start_by_fork
args.func(args, dag=self.dag)
File "/home/ec2-user/venv/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 48, in command
retu... | 2,477 | |||
apache/airflow | apache__airflow-20902 | c59001d79facf7e472e0581ac8a538c25eebfda7 | diff --git a/airflow/migrations/versions/e655c0453f75_add_taskmap_and_map_id_on_taskinstance.py b/airflow/migrations/versions/e655c0453f75_add_taskmap_and_map_id_on_taskinstance.py
--- a/airflow/migrations/versions/e655c0453f75_add_taskmap_and_map_id_on_taskinstance.py
+++ b/airflow/migrations/versions/e655c0453f75_add... | Airflow database upgrade fails with "psycopg2.errors.NotNullViolation: column "map_index" of relation "task_instance" contains null value"s
### Apache Airflow version
main (development)
### What happened
I currently have Airflow 2.2.3 and due to this [issue](https://github.com/apache/airflow/issues/19699) I have tri... | Thanks for opening your first issue here! Be sure to follow the issue template!
Map Index hasn't been introduced in Airflow 2.2.3 and nor intended to, it is currently available in the main branch and will be released in Airflow 2.3. So it is very likely you somehow used some code from the main branch.
Check https:/... | 2022-01-17T10:51:24Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/usr/local/lib/python3.7/dist-packages/sqlalchemy/engine/default.py", line 608, in do_execute
cursor.execute(statement,... | 2,480 | |||
apache/airflow | apache__airflow-21116 | dff536e9409c6fe885aa61402772b946e33dda08 | diff --git a/airflow/models/taskinstance.py b/airflow/models/taskinstance.py
--- a/airflow/models/taskinstance.py
+++ b/airflow/models/taskinstance.py
@@ -112,7 +112,7 @@
from airflow.utils.retries import run_with_db_retries
from airflow.utils.session import NEW_SESSION, create_session, provide_session
from airflow.... | Running airflow dags backfill --reset-dagruns <dag_id> -s <execution_start_dt> -e <execution_end_dt> results in error when run twice.
### Apache Airflow version
2.2.3 (latest released)
### What happened
It's the same situation as https://github.com/apache/airflow/issues/21023.
Only change to `airflow dags backfil... | Feel free to fix ! If you are fast, we can even cherry-pick to 2.2.4 :)
We should also add a check in `clear_task_instances` to catch this mistake, otherwise future regressions are bound to happen.
> We should also add a check in `clear_task_instances` to catch this mistake, otherwise future regressions are bound to ha... | 2022-01-26T07:42:40Z | [] | [] |
Traceback (most recent call last):
File "/home1/www/venv3/airflow/bin/airflow", line 8, in <module>
sys.exit(main())
File "/home1/www/venv3/airflow/lib/python3.7/site-packages/airflow/__main__.py", line 48, in main
args.func(args)
File "/home1/www/venv3/airflow/lib/python3.7/site-packages/airflow/cl... | 2,483 | |||
apache/airflow | apache__airflow-21289 | dc3c47dacd2a7058358cc5874b0064a064d4c51e | diff --git a/airflow/providers/elasticsearch/log/es_task_handler.py b/airflow/providers/elasticsearch/log/es_task_handler.py
--- a/airflow/providers/elasticsearch/log/es_task_handler.py
+++ b/airflow/providers/elasticsearch/log/es_task_handler.py
@@ -103,15 +103,25 @@ def __init__(
self.handler: Union[logging.... | Elasticsearch remote log will not fetch task logs from manual dagruns before 2.2 upgrade
### Apache Airflow Provider(s)
elasticsearch
### Versions of Apache Airflow Providers
```
apache-airflow-providers-amazon==1!2.5.0
apache-airflow-providers-cncf-kubernetes==1!2.1.0
apache-airflow-providers-datadog==1!2.... | All credit to @wolfier for finding this bug.
Seems like the problem in the code is that [this section](https://github.com/apache/airflow/blob/main/airflow/providers/elasticsearch/log/es_task_handler.py#L109-L114) requires resolving the `data_interval_start` and `data_interval_end` even if the format you specify does no... | 2022-02-03T10:11:25Z | [] | [] |
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.9/site-packages/airflow/utils/log/file_task_handler.py", line 239, in read
log, metadata = self._read(task_instance, try_number_element, metadata)
File "/usr/local/lib/python3.9/site-packages/airflow/provi... | 2,486 | |||
apache/airflow | apache__airflow-21307 | 2c5f636e5cfac7cc246d6ed93660bf0f8e968982 | diff --git a/airflow/providers/google/cloud/transfers/postgres_to_gcs.py b/airflow/providers/google/cloud/transfers/postgres_to_gcs.py
--- a/airflow/providers/google/cloud/transfers/postgres_to_gcs.py
+++ b/airflow/providers/google/cloud/transfers/postgres_to_gcs.py
@@ -59,7 +59,8 @@ def description(self):
"""... | PostgresToGCSOperator fail on empty table and use_server_side_cursor=True
### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
apache-airflow-providers-google==6.1.0
### Apache Airflow version
2.2.2 (latest released)
### Operating System
Debian GNU/Linux 10 (buster)
### Deployment
Oth... | Thanks for opening your first issue here! Be sure to follow the issue template!
Maybe you would like to contribute a PR to fix it @PikaYellow35 - we have > 1800 contributors, you can become one of them :)
Hi @potiuk - I am new to Airflow and would like to try and take a pass at this.
I have breeze setup; have crea... | 2022-02-03T21:12:47Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1332, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1458, in _execute_tas... | 2,487 | |||
apache/airflow | apache__airflow-22685 | 2cf1ae30538e109627417e8f0c1650addac3311b | diff --git a/airflow/dag_processing/manager.py b/airflow/dag_processing/manager.py
--- a/airflow/dag_processing/manager.py
+++ b/airflow/dag_processing/manager.py
@@ -1065,6 +1065,7 @@ def prepare_file_path_queue(self):
def _kill_timed_out_processors(self):
"""Kill any file processors that timeout to defe... | dag_processing code needs to handle OSError("handle is closed") in poll() and recv() calls
### Apache Airflow version
2.1.4
### What happened
The problem also exists in the latest version of the Airflow code, but I experienced it in 2.1.4.
This is the root cause of problems experienced in [issue#13542](https://gi... | Thanks for opening your first issue here! Be sure to follow the issue template!
Feel free to submit a pull request to handle the exception! We can figure out how to test the solution in the review process.
BTW I don’t know what your current fix looks like, but `OSError` has an `errno` attribute, checking that in th... | 2022-04-01T10:45:10Z | [] | [] |
Traceback (most recent call last):
File "/opt/python3.8/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
self.run()
File "/opt/python3.8/lib/python3.8/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/opt/python3.8/lib/python3.8/site-pac... | 2,505 | |||
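`multiprocessing` pipe ends raise `OSError("handle is closed")` from `poll()`/`recv()` once the other side has been reaped, so the manager's kill/collect loop needs to treat that as "nothing to read". A runnable sketch of such a wrapper (the helper name is hypothetical):

```python
import multiprocessing


def safe_poll(conn, timeout=0):
    """Poll a pipe end, treating a closed handle as 'nothing to read'."""
    try:
        return conn.poll(timeout)
    except OSError:
        # e.g. OSError: handle is closed -- the processor was already
        # terminated and its pipe torn down; don't crash the manager loop.
        return False


parent_end, child_end = multiprocessing.Pipe()
parent_end.close()
print(safe_poll(parent_end))  # False instead of an unhandled OSError
```

As the comments note, checking `OSError.errno` would allow distinguishing a closed handle from other I/O failures, at the cost of a slightly more involved handler.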
apache/airflow | apache__airflow-23053 | c3d883a971a8e4e65ccc774891928daaaa0f4442 | diff --git a/airflow/jobs/backfill_job.py b/airflow/jobs/backfill_job.py
--- a/airflow/jobs/backfill_job.py
+++ b/airflow/jobs/backfill_job.py
@@ -266,7 +266,7 @@ def _manage_executor_state(
if ti.state not in self.STATES_COUNT_AS_RUNNING:
# Don't use ti.task; if this task is mapped, that ... | A task's returned object should not be checked for mappability if the dag doesn't use it in an expansion.
### Apache Airflow version
main (development)
### What happened
Here's a dag:
```python3
with DAG(...) as dag:
    @dag.task
    def foo():
        return "foo"

    @dag.task
    def identity(thing)... | Marking this to 2.3.0, but we _might_ push it to 2.3.1 | 2022-04-18T08:29:58Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1417, in _run_raw_task
self._execute_task_with_callbacks(context, test_mode)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1564, in _execute_task_with_ca... | 2,516 | |||
apache/airflow | apache__airflow-23119 | 70eede5dd6924a4eb74b7600cce2c627e51a3b7e | diff --git a/airflow/dag_processing/processor.py b/airflow/dag_processing/processor.py
--- a/airflow/dag_processing/processor.py
+++ b/airflow/dag_processing/processor.py
@@ -604,7 +604,7 @@ def _execute_task_callbacks(self, dagbag: DagBag, request: TaskCallbackRequest):
if simple_ti.task_id in dag.task_id... | Mapped KubernetesPodOperator "fails" but UI shows it is as still running
### Apache Airflow version
2.3.0b1 (pre-release)
### What happened
This dag has a problem. The `name` kwarg is missing from one of the mapped instances.
```python3
from datetime import datetime
from airflow import DAG
from airflow.provid... | Yeah, two things we can do here:
1. Improve the validation for KPO when mapped (framework is in place already)
2. (This one is most important) Find out why the task isn't being detected as failed!
Curious, something is up with the zombie detection too:
```[2022-04-20 08:42:36,843] 331186 MainProcess {{airflow.... | 2022-04-20T14:03:34Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1440, in _run_raw_task
self._execute_task_with_callbacks(context, test_mode)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1544, in _execute_task_with_ca... | 2,520 | |||
apache/airflow | apache__airflow-23463 | 35620edd4b5b108adf355855e03224a08d132b10 | diff --git a/airflow/decorators/base.py b/airflow/decorators/base.py
--- a/airflow/decorators/base.py
+++ b/airflow/decorators/base.py
@@ -312,6 +312,9 @@ def _validate_arg_names(self, func: ValidationSource, kwargs: Dict[str, Any]):
raise TypeError(f"{func}() got unexpected keyword arguments {names}")
... | Empty `expand()` crashes the scheduler
### Apache Airflow version
2.3.0 (latest released)
### What happened
I've found a DAG that will crash the scheduler:
```
@task
def hello():
    return "hello"

hello.expand()
```
```
[2022-05-03 03:41:23,779] {scheduler_job.py:753} ERROR - Exception whe... | We should make this a parse-time error because expanding nothing makes no sense anyway. If we really want this to work (which I assume should just be expanding to one task), this can be easily amendable by adding a `1` to the end of the currently crashing `reduce` call. | 2022-05-03T22:41:24Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 736, in _execute
self._run_scheduler_loop() ... | 2,526 | |||
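The crash comes from `functools.reduce` over an empty sequence of expansion lengths, and the comment notes it is "easily amendable by adding a `1`" as the initializer (the preferred fix instead rejects an empty `expand()` at parse time). A sketch of the initializer variant:

```python
import operator
from functools import reduce


def total_expansion_length(mapped_lengths):
    """Total mapped task instances: the product of all expansion lengths.

    Without the initial value of 1, reduce() raises TypeError on an empty
    sequence -- exactly the crash a bare `expand()` triggered.
    """
    return reduce(operator.mul, mapped_lengths, 1)


print(total_expansion_length([]), total_expansion_length([3, 2]))  # 1 6
```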
apache/airflow | apache__airflow-24865 | f54782af8888065b464a44a6ea194a4fbb15b296 | diff --git a/airflow/models/baseoperator.py b/airflow/models/baseoperator.py
--- a/airflow/models/baseoperator.py
+++ b/airflow/models/baseoperator.py
@@ -1153,11 +1153,10 @@ def on_kill(self) -> None:
"""
def __deepcopy__(self, memo):
- """
- Hack sorting double chained task lists by task... | mini-scheduler raises AttributeError: 'NoneType' object has no attribute 'keys'
### Apache Airflow version
2.3.2 (latest released)
### What happened
The mini-scheduler run after a task finishes sometimes fails with an error "AttributeError: 'NoneType' object has no attribute 'keys'"; see full traceback below.
### W... | Thanks for opening your first issue here! Be sure to follow the issue template!
Isn't that the same class of problem as with #23838 @pingzh @ashb @uranusjr ?
I don’t think it’s the same. The final error looks similar, but this one is triggered by `task_dict`, which is on the DAG object, not Operator. The DAG structure... | 2022-07-06T09:28:36Z | [] | [] |
Traceback (most recent call last):
File "/app/airflow-bug-minimal.py", line 22, in <module>
dag.partial_subset("tg1.task1")
File "/venv/lib/python3.10/site-packages/airflow/models/dag.py", line 2013, in partial_subset
dag.task_dict = {
File "/venv/lib/python3.10/site-packages/airflow/models/dag.py",... | 2,550 | |||
apache/airflow | apache__airflow-24943 | abb034113540b708e87379665a1b5caadb8748bc | diff --git a/airflow/models/taskinstance.py b/airflow/models/taskinstance.py
--- a/airflow/models/taskinstance.py
+++ b/airflow/models/taskinstance.py
@@ -2296,8 +2296,13 @@ def get_email_subject_content(
def render(key: str, content: str) -> str:
if conf.has_option('email', key):
... | Send default email if file "html_content_template" not found
### Apache Airflow version
2.3.2 (latest released)
### What happened
I created a new email template to be sent when there are task failures. I accidentally added the path to the `[email] html_content_template` and `[email] subject_template` with a typo and... | You seem to know how to fix it, would you like to make a PR fixing it? Otherwise it will have to wait for someone who would likel to pick it.
Hi @potiuk
I have never created a PR on such a complex project. However, since this seems easy to fix, it should prove useful to learn! I'll create the PR during the followin... | 2022-07-09T21:22:00Z | [] | [] |
Traceback (most recent call last):
File "/home/user/.conda/envs/airflow/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1942, in handle_failure
self.email_alert(error, task)
File "/home/user/.conda/envs/airflow/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 2323, in emai... | 2,551 | |||
apache/airflow | apache__airflow-25312 | 741c20770230c83a95f74fe7ad7cc9f95329f2cc | diff --git a/airflow/models/taskinstance.py b/airflow/models/taskinstance.py
--- a/airflow/models/taskinstance.py
+++ b/airflow/models/taskinstance.py
@@ -287,6 +287,7 @@ def clear_task_instances(
if dag_run_state == DagRunState.QUEUED:
dr.last_scheduling_decision = None
d... | Scheduler crashes with psycopg2.errors.DeadlockDetected exception
### Apache Airflow version
2.2.5 (latest released)
### What happened
Customer has a dag that generates around 2500 tasks dynamically using a task group. While running the dag, a subset of the tasks (~1000) run successfully with no issue and (~1500) of... | Thanks for opening your first issue here! Be sure to follow the issue template!
I faced the same issue with [airflow 2.3.0rc2](https://github.com/apache/airflow/tree/2.3.0rc2)
Had a basic dag added.
```py
from datetime import datetime
from airflow import DAG
from airflow.decorators import task
with DAG(dag_i... | 2022-07-26T15:57:55Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1256, in _execute_context
self.dialect.do_executemany(
File "/usr/local/lib/python3.9/site-packages/sqlalchemy/dialects/postgresql/psycopg2.py", line 912, in do_executemany
cursor.executeman... | 2,556 | |||
apache/airflow | apache__airflow-25355 | f6b48ac6dfaf931a5433ec16369302f68f038c65 | diff --git a/airflow/decorators/base.py b/airflow/decorators/base.py
--- a/airflow/decorators/base.py
+++ b/airflow/decorators/base.py
@@ -455,7 +455,7 @@ def _expand_mapped_kwargs(self, resolve: Optional[Tuple[Context, Session]]) -> D
assert self.expand_input is EXPAND_INPUT_EMPTY
return {"op_kwargs"... | expand_kwargs.map(func) gives unhelpful error message if func returns list
### Apache Airflow version
main (development)
### What happened
Here's a DAG:
```python3
with DAG(
dag_id="expand_list",
doc_md="try to get kwargs from a list",
schedule_interval=None,
start_date=datetime(2001,... | 2022-07-28T06:02:28Z | [] | [] |
Traceback (most recent call last):
File "/home/matt/src/airflow/airflow/executors/debug_executor.py", line 78, in _run_task
ti.run(job_id=ti.job_id, **params)
File "/home/matt/src/airflow/airflow/utils/session.py", line 71, in wrapper
return func(*args, session=session, **kwargs)
File "/home/matt/sr... | 2,557 | ||||
apache/airflow | apache__airflow-25757 | dc738cde04d91084bf79b1a601395b7abd41d8ca | diff --git a/airflow/models/mappedoperator.py b/airflow/models/mappedoperator.py
--- a/airflow/models/mappedoperator.py
+++ b/airflow/models/mappedoperator.py
@@ -623,7 +623,17 @@ def expand_mapped_task(self, run_id: str, *, session: Session) -> Tuple[Sequence
from airflow.models.taskinstance import TaskInstan... | Backfill mode with mapped tasks: "Failed to populate all mapping metadata"
### Apache Airflow version
2.3.3
### What happened
I was backfilling some DAGs that use dynamic tasks when I got an exception like the following:
```
Traceback (most recent call last):
File "/opt/conda/envs/production/bin/airflow... | After #25661 this may cause a different error. I will need to look into this further (not now).
To clarify, the downstream (mapped task) will never run correctly in any scenarios, since if the upstream raises an exception, there’s nothing the task can be expanded into. But the scheduler should handle this more grace... | 2022-08-17T07:01:07Z | [] | [] |
Traceback (most recent call last):
File "/opt/conda/envs/production/bin/airflow", line 11, in <module>
sys.exit(main())
File "/opt/conda/envs/production/lib/python3.9/site-packages/airflow/__main__.py", line 38, in main
args.func(args)
File "/opt/conda/envs/production/lib/python3.9/site-packages/air... | 2,569 | |||
apache/airflow | apache__airflow-25793 | 648e224cd455f1e374c58cfa48eb1c0ed69c698d | diff --git a/airflow/models/abstractoperator.py b/airflow/models/abstractoperator.py
--- a/airflow/models/abstractoperator.py
+++ b/airflow/models/abstractoperator.py
@@ -26,6 +26,7 @@
Dict,
FrozenSet,
Iterable,
+ Iterator,
List,
Optional,
Sequence,
@@ -54,6 +55,7 @@
from airflow... | XComs from another task group fail to populate dynamic task mapping metadata
### Apache Airflow version
2.3.3
### What happened
When a task returns a mappable Xcom within a task group, the dynamic task mapping feature (via `.expand`) causes the Airflow Scheduler to infinitely loop with a runtime error:
```
... | Thanks for opening your first issue here! Be sure to follow the issue template!
| 2022-08-18T11:00:15Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/bin/airflow", line 8, in <module>
sys.exit(main())
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/__main__.py", line 38, in main
args.func(args)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/cli/... | 2,571 | |||
apache/airflow | apache__airflow-25821 | 0254f30a5a90f0c3104782525fabdcfdc6d3b7df | diff --git a/airflow/providers/common/sql/operators/sql.py b/airflow/providers/common/sql/operators/sql.py
--- a/airflow/providers/common/sql/operators/sql.py
+++ b/airflow/providers/common/sql/operators/sql.py
@@ -350,7 +350,7 @@ class SQLTableCheckOperator(BaseSQLOperator):
sql_check_template = """
SE... | SQLTableCheckOperator fails for Postgres
### Apache Airflow version
2.3.3
### What happened
`SQLTableCheckOperator` fails when used with Postgres.
### What you think should happen instead
From the logs:
```
[2022-08-19, 09:28:14 UTC] {taskinstance.py:1910} ERROR - Task failed with exception
Traceback ... | Thanks for opening your first issue here! Be sure to follow the issue template!
| 2022-08-19T11:12:33Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/airflow/providers/common/sql/operators/sql.py", line 296, in execute
records = hook.get_first(self.sql)
File "/usr/local/lib/python3.9/site-packages/airflow/hooks/dbapi.py", line 178, in get_first
cur.execute(sql)
psycopg2... | 2,573 | |||
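The one-line patch above rewrites the operator's SQL template because Postgres, unlike some other backends, will not aggregate raw booleans: each per-row check result has to be cast to an integer before summing. The sum-equals-row-count pattern, demonstrated on an in-memory SQLite stand-in rather than Postgres:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (x INT)")
con.executemany("INSERT INTO t VALUES (?)", [(1,), (2,), (3,)])

# A table-level check in the SELECT-sum style: turn each row's boolean
# check result into an integer, then compare the sum with the row count.
# If they differ, at least one row failed the check.
row_count, passed = con.execute(
    "SELECT COUNT(*), SUM(CASE WHEN x > 0 THEN 1 ELSE 0 END) FROM t"
).fetchone()
assert passed == row_count  # all rows satisfy x > 0
```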
apache/airflow | apache__airflow-25970 | 876536ea3c45d5f15fcfbe81eda3ee01a101faa3 | diff --git a/airflow/configuration.py b/airflow/configuration.py
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -1545,19 +1545,18 @@ def get_custom_secret_backend() -> Optional[BaseSecretsBackend]:
"""Get Secret Backend if defined in airflow.cfg"""
secrets_backend_cls = conf.getimport(sectio... | Unable to configure Google Secrets Manager in 2.3.4
### Apache Airflow version
2.3.4
### What happened
I am attempting to configure a Google Secrets Manager secrets backend using the `gcp_keyfile_dict` param in a `.env` file with the following ENV Vars:
```
AIRFLOW__SECRETS__BACKEND=airflow.providers.google.clou... | @pdebelak - I think this is caused by the LRU cache introduced in https://github.com/apache/airflow/pull/25556 - is it possible you take a look and see if it can be fixed/workarounded ?
I believe the problem is that dict-indeed is not hashable, and you can pass the dict as parameter of the secret backend configuration.... | 2022-08-25T23:24:46Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/bin/airflow", line 5, in <module>
from airflow.__main__ import main
File "/usr/local/lib/python3.9/site-packages/airflow/__init__.py", line 35, in <module>
from airflow import settings
File "/usr/local/lib/python3.9/site-packages/airflow/settings.p... | 2,575 | |||
apache/airflow | apache__airflow-26369 | 5e9589c685bcec769041e0a1692035778869f718 | diff --git a/airflow/serialization/serialized_objects.py b/airflow/serialization/serialized_objects.py
--- a/airflow/serialization/serialized_objects.py
+++ b/airflow/serialization/serialized_objects.py
@@ -17,6 +17,7 @@
"""Serialized DAG and BaseOperator"""
from __future__ import annotations
+import collections.ab... | dynamic dataset ref breaks when viewed in UI or when triggered (dagbag.py:_add_dag_from_db)
### Apache Airflow version
2.4.0b1
### What happened
Here's a file which defines three dags. "source" uses `Operator.partial` to reference either "sink". I'm not sure if it's supported to do so, but airlflow should at... | 2022-09-13T15:30:47Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 2525, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1822, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/... | 2,584 | ||||
apache/airflow | apache__airflow-27609 | c20c3f01ca069e98e302a328fe45e3d750956d03 | diff --git a/airflow/callbacks/callback_requests.py b/airflow/callbacks/callback_requests.py
--- a/airflow/callbacks/callback_requests.py
+++ b/airflow/callbacks/callback_requests.py
@@ -84,17 +84,17 @@ def __init__(
self.is_failure_callback = is_failure_callback
def to_json(self) -> str:
- dict_... | Object of type V1Pod is not JSON serializable after detecting zombie jobs cause Scheduler CrashLoopBack
### Apache Airflow version
2.4.2
### What happened
Some DAGs have tasks with pod_override in executor_config that become zombie tasks. The Airflow scheduler then crashes with this exception:
```
[2022-11-10T15:29:59.... | Thanks for opening your first issue here! Be sure to follow the issue template!
Did you try `airflow dags reserialize` ? https://airflow.apache.org/docs/apache-airflow/stable/cli-and-env-variables-ref.html#reserialize
Can you check if it fixes your problem?
Having same problem ` airflow dags reserialize` ain't he... | 2022-11-10T22:50:53Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 746, in _execute
self._run_scheduler_loop()
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 878, in _run_scheduler_loop
next_event =... | 2,601 | |||
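The underlying failure is `json.dumps` meeting a `V1Pod` inside `executor_config`; the patch routes callback payloads through Airflow's own serializer instead. The generic JSON-side mechanism, a `default=` hook for objects the encoder does not know, looks like this (with an invented stand-in class, not the real `V1Pod`):

```python
import json

class PodStandIn:  # stand-in for kubernetes.client.V1Pod
    def __init__(self, name):
        self.metadata = {"name": name}

def fallback(obj):
    # Last-resort view for objects json cannot encode natively.
    return getattr(obj, "__dict__", str(obj))

payload = {"executor_config": {"pod_override": PodStandIn("worker-pod")}}
encoded = json.dumps(payload, default=fallback)
assert "worker-pod" in encoded
```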
apache/airflow | apache__airflow-28003 | 527fbce462429fc9836837378f801eed4e9d194f | diff --git a/airflow/models/dag.py b/airflow/models/dag.py
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -1950,7 +1950,7 @@ def set_dag_runs_state(
@provide_session
def clear(
self,
- task_ids: Collection[str] | Collection[tuple[str, int]] | None = None,
+ task_ids: Collect... | Clearing dag run via UI fails on main branch and 2.5.0rc2
### Apache Airflow version
main (development)
### What happened
Create a simple dag, allow it to completely run through.
Next, when in grid view, on the left hand side click on the dag run at the top level.
On the right hand side, then click on "Clear... | Issue is that this fn is missing the `@provide_session` decorator and/or the `session` argument shouldn't have a default value.
```
def _clear_dag_tis(
self,
dag: DAG,
start_date: datetime | None,
end_date: datetime | None,
origin: str,
task_ids=None,
... | 2022-11-30T09:28:38Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2525, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1822, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/... | 2,609 | |||
apache/airflow | apache__airflow-28191 | 84a5faff0de2a56f898b8a02aca578b235cb12ba | diff --git a/airflow/models/xcom.py b/airflow/models/xcom.py
--- a/airflow/models/xcom.py
+++ b/airflow/models/xcom.py
@@ -723,6 +723,21 @@ def __eq__(self, other: Any) -> bool:
return all(x == y for x, y in z)
return NotImplemented
+ def __getstate__(self) -> Any:
+ # We don't want to... | Dynamic task context fails to be pickled
### Apache Airflow version
2.5.0
### What happened
When I upgraded to 2.5.0, running a dynamic task test failed.
```py
from airflow.decorators import task, dag
import pendulum as pl
@dag(
dag_id='test-dynamic-tasks',
schedule=None,
start_date=pl.today().... | 2022-12-07T13:23:20Z | [] | [] |
Traceback (most recent call last):
File "/home/andi/airflow/venv38/lib/python3.8/site-packages/airflow/decorators/base.py", line 217, in execute
return_value = super().execute(context)
File "/home/andi/airflow/venv38/lib/python3.8/site-packages/airflow/operators/python.py", line 356, in execute
return ... | 2,611 | ||||
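The patch adds `__getstate__`/`__setstate__` so the lazy XCom wrapper drops its live database session before pickling. The general pattern for excluding runtime-only state from pickle, shown on a hypothetical class:

```python
import pickle

class LazyValues:
    def __init__(self, values):
        self._session = object()  # runtime-only handle (think: a DB session)
        self._values = list(values)

    def __getstate__(self):
        state = self.__dict__.copy()
        del state["_session"]  # never serialize the live handle
        return state

    def __setstate__(self, state):
        self.__dict__.update(state)
        self._session = None  # re-acquire lazily when next needed

restored = pickle.loads(pickle.dumps(LazyValues([1, 2, 3])))
assert restored._values == [1, 2, 3] and restored._session is None
```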
apache/airflow | apache__airflow-28397 | fefcb1d567d8d605f7ec9b7d408831d656736541 | diff --git a/airflow/models/dag.py b/airflow/models/dag.py
--- a/airflow/models/dag.py
+++ b/airflow/models/dag.py
@@ -2549,7 +2549,7 @@ def create_dagrun(
external_trigger: bool | None = False,
conf: dict | None = None,
run_type: DagRunType | None = None,
- session=NEW_SESSION,
+ ... | Triggering a DAG with the same run_id as a scheduled one causes the scheduler to crash
### Apache Airflow version
2.5.0
### What happened
A user with access to manually triggering DAGs can trigger a DAG, provide a run_id that matches the pattern used when creating scheduled runs, and cause the scheduler to cras... | Smth to take a look at 2.5.1
Code around this has actually changed quite a bit due to datasets. A reproduction on latest Airflow would be awesome.
Yep. @uranusjr is right. We are releasing 2.5.0 in coming days - can you please try to reproduce it when we do @saulbein ?
I reproduced the same bug with 2.4.3, will try wh... | 2022-12-16T06:38:41Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 759, in _execute
self._run_scheduler_loop()
File "/usr/local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 885, in _run_scheduler_loop
num_queued_tis = self._do_schedu... | 2,615 | |||
apache/airflow | apache__airflow-29518 | 1e7c064560b8504b45e3a53dc8f294b143b4ec7d | diff --git a/airflow/providers/google/cloud/operators/cloud_base.py b/airflow/providers/google/cloud/operators/cloud_base.py
--- a/airflow/providers/google/cloud/operators/cloud_base.py
+++ b/airflow/providers/google/cloud/operators/cloud_base.py
@@ -18,6 +18,8 @@
"""This module contains a Google API base operator."""... | KubernetesExecutor leaves failed pods due to deepcopy issue with Google providers
### Apache Airflow version
Other Airflow 2 version (please specify below)
### What happened
With Airflow 2.3 and 2.4 there appears to be a bug in the KubernetesExecutor when used in conjunction with the Google airflow providers. ... | Thanks for opening your first issue here! Be sure to follow the issue template!
This seems to be a limitation in Python until 3.10 (not sure when exactly this was fixed). I’m not quite sure if we want to (or even _can_) fix this, to be honest.
Ah, maybe this could be possible if we implement custom copying logic on op... | 2023-02-13T23:06:58Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/bin/airflow", line 8, in <module>
sys.exit(main())
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/__main__.py", line 39, in main
args.func(args)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/cli/cli_parser... | 2,633 | |||
apache/airflow | apache__airflow-31033 | ae7e1dbc8a772e0b6d749f2347b3951369d906e7 | diff --git a/airflow/executors/base_executor.py b/airflow/executors/base_executor.py
--- a/airflow/executors/base_executor.py
+++ b/airflow/executors/base_executor.py
@@ -38,7 +38,8 @@
if TYPE_CHECKING:
from airflow.callbacks.base_callback_sink import BaseCallbackSink
from airflow.callbacks.callback_requests... | DB migration job fails with circular import
### Apache Airflow version
2.6.0
### What happened
I upgraded my Airflow 2.5.3 to 2.6.0 using the official Helm chart 1.9.0 installation on a Kubernetes cluster. The DB migration job fails with a circular import of "TaskInstanceKey". The image I'm using is `apache/ai... | Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise PR to address this issue please do so, no need to wait for approval.
The loop is `DagRun -> TaskInstance -> Sentry -> KubernetesExecutor -> kubernetes_helper ->TaskInstanceKey`. I guess this happened because I e... | 2023-05-03T06:43:10Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/bin/airflow", line 8, in <module>
sys.exit(main())
| 2,643 | |||
apache/airflow | apache__airflow-8165 | 6c273466d598d7bcfb7c21feafcccb07cc4230fb | diff --git a/airflow/www/utils.py b/airflow/www/utils.py
--- a/airflow/www/utils.py
+++ b/airflow/www/utils.py
@@ -406,8 +406,8 @@ class CustomSQLAInterface(SQLAInterface):
'_' from the key to lookup the column names.
"""
- def __init__(self, obj):
- super().__init__(obj)
+ def __init__(self, o... | Security Views broken
Thanks, @KostyaEsmukov for finding this bug. I was able to replicate this bug.
**Apache Airflow version**: 1.10.10.rc3
**Environment**:
- **OS** (e.g. from /etc/os-release): MacOs
- **Kernel** (e.g. `uname -a`): `Darwin MacBook-Pro.local 19.3.0 Darwin Kernel Version 19.3.0: Thu Jan ... | Wonder if this is related to my DT changes where I had to hack around FABs SQLA models
> Wonder if this is related to my DT changes where I had to hack around FABs SQLA models
Yeah, very likely
Verified this error occurred because of a bug in f8f0bec38 | 2020-04-06T16:41:29Z | [] | [] |
Traceback (most recent call last):
File "/Users/kaxilnaik/.virtualenvs/airflow_test_upgrade/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/Users/kaxilnaik/.virtualenvs/airflow_test_upgrade/lib/python3.7/site-packages/flask/app.py", line 1952,... | 2,682 | |||
apache/airflow | apache__airflow-8230 | 55d379c71ffd3b765d446fb12a339114f3b0c14f | diff --git a/airflow/models/chart.py b/airflow/models/chart.py
--- a/airflow/models/chart.py
+++ b/airflow/models/chart.py
@@ -21,6 +21,7 @@
from sqlalchemy.orm import relationship
from airflow.models.base import Base, ID_LEN
+from airflow.models.user import User
from airflow.utils.sqlalchemy import UtcDateTime
f... | Airflow webserver not starting with SQLAlchemy==1.3.16
**Apache Airflow version**: 1.10.9
**Environment**: Ubuntu 18.04 LTS
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):Ubuntu 18.04 LTS
**What happened**: airflow webserver error
airflow@airflow:~$ airflow webserver
... | Thanks for opening your first issue here! Be sure to follow the issue template!
Actually, I got the same error with the SequentialExecutor as well
The same problem here: I'm on `apache-airflow==1.10.9` on Ubuntu 18.04. I tried to downgrade to `1.10.8`, but the problem remains.
I run `airflow==1.10.7` it seems the i... | 2020-04-09T11:57:17Z | [] | [] |
Traceback (most recent call last):
File "/home/airflow/.local/bin/airflow", line 37, in <module>
args.func(args)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/cli.py", line 75, in wrapper
return f(*args, **kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/... | 2,684 | |||
apache/airflow | apache__airflow-8512 | 57c8c05839f66ed2909b1bee8ff6976432db82aa | diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -598,6 +598,7 @@ def write_version(filename: str = os.path.join(*[my_dir, "airflow", "git_version
'tzlocal>=1.4,<2.0.0',
'unicodecsv>=0.14.1',
'werkzeug<1.0.0',
+ 'WTforms<2.3.0', # TODO: Remove after https://github.com/dpgaspar/Flask-... | WTFroms new release 2.3.0 breaks airflow 1.10.10
<!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks... | Thanks for opening your first issue here! Be sure to follow the issue template!
FYI. if you want to install airflow in repeatable way, as of airlfow 1.10.10 you have the way to install airflow repeatably no matter if there were some breaking packages:
I will make it more prominent in README/INSTALL now that we hav... | 2020-04-22T09:59:56Z | [] | [] |
Traceback (most recent call last):
File "/home/sgu/miniconda3/envs/tmp/bin/airflow", line 26, in <module>
from airflow.bin.cli import CLIFactory
File "/home/sgu/miniconda3/envs/tmp/lib/python3.6/site-packages/airflow/bin/cli.py", line 71, in <module>
from airflow.www_rbac.app import cached_app as cache... | 2,688 | |||
apache/airflow | apache__airflow-8671 | c717d12f47c604082afc106b7a4a1f71d91f73e2 | diff --git a/airflow/configuration.py b/airflow/configuration.py
--- a/airflow/configuration.py
+++ b/airflow/configuration.py
@@ -18,6 +18,7 @@
import copy
import logging
+import multiprocessing
import os
import pathlib
import re
@@ -180,12 +181,8 @@ def __init__(self, default_config=None, *args, **kwargs):
... | [AIRFLOW-6529] Pickle error occurs when the scheduler tries to run on macOS.
When we try to run the scheduler on macOS, we get a serialization error like the following.
```
____________ _____________
____ |__( )_________ __/__ /________ __
____ /| |_ /__ ___/_ /_ __ /_ __ \_ | /| / /
___... | 2020-05-01T15:58:00Z | [] | [] |
Traceback (most recent call last):
File "/Users/sarutak/work/oss/airflow-env/master-python3.8.1/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 1498, in _execute
self._execute_helper()
File "/Users/sarutak/work/oss/airflow-env/master-python3.8.1/lib/python3.8/site-packages/airflow/jobs/sche... | 2,694 | ||||
apache/airflow | apache__airflow-8787 | 2bd3e760deaed9c59508a90718d08cdf90cd928f | diff --git a/airflow/providers/apache/spark/example_dags/example_spark_dag.py b/airflow/providers/apache/spark/example_dags/example_spark_dag.py
--- a/airflow/providers/apache/spark/example_dags/example_spark_dag.py
+++ b/airflow/providers/apache/spark/example_dags/example_spark_dag.py
@@ -48,7 +48,6 @@
jdbc_to_sp... | Spark JDBC Hook fails if spark_conf is not specified
**Apache Airflow version**: 1.10.10
**What happened**:
In SparkJDBCHook, the `spark_conf` parameter defaults to None; if it is kept like that, it raises an error:
```
Traceback (most recent call last):
File "/Users/rbottega/Documents/airflow_latest/env/lib/python3... | 2020-05-08T12:30:25Z | [] | [] |
Traceback (most recent call last):
File "/Users/rbottega/Documents/airflow_latest/env/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 983, in _run_raw_task
result = task_copy.execute(context=context)
File "/Users/rbottega/Documents/airflow_latest/env/lib/python3.7/site-packages/airflow/con... | 2,696 | ||||
apache/airflow | apache__airflow-9779 | 1de78e8f97f48f8f4abd167a0120ffab8af6127a | diff --git a/airflow/www/utils.py b/airflow/www/utils.py
--- a/airflow/www/utils.py
+++ b/airflow/www/utils.py
@@ -321,8 +321,8 @@ def wrapped_markdown(s, css_class=None):
return None
return Markup(
- '<div class="rich_doc {css_class}" >' + markdown.markdown(s) + "</div>"
- ).format(css_class=... | JSON notation in Airflow DAG comments causing KeyError
-->
**Apache Airflow version**: 1.10.11 (code is working fine in 1.10.10)
**Environment**: Python 3.7
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Catalina 10.15.5
- **Kernel** (e.g. `uname -a`):
- **Install tool... | @prakshalj0512 what version of flask do you use?
```
Flask 1.1.2
Flask-Admin 1.5.4
Flask-AppBuilder 2.3.0
Flask-Babel 1.0.0
Flask-Caching 1.3.3
Flask-JWT-Extended 3.24.1
Flask-Login 0.4.1
Flask-OpenID 1.2.5
Flask-RESTful 0.... | 2020-07-12T13:06:57Z | [] | [] |
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/... | 2,724 | |||
celery/celery | celery__celery-1206 | 707fdb39d057bfc5179626113fb77aeca56a6ac6 | diff --git a/celery/backends/mongodb.py b/celery/backends/mongodb.py
--- a/celery/backends/mongodb.py
+++ b/celery/backends/mongodb.py
@@ -96,7 +96,10 @@ def _get_connection(self):
# This enables the use of replica sets and sharding.
# See pymongo.Connection() for more info.
args ... | MongoDB and BROKER_USE_SSL=True
I've recently started using MongoDB with BROKER_USE_SSL=True, and this doesn't seem to work: Celery keeps trying to reconnect, logging a "Re-establishing connection" message. BROKER_USE_SSL=False works fine.
``` python
[2013-02-21 14:57:45,708: DEBUG/MainProcess] consumer: Re-establishing connection to... | 2013-02-23T09:59:42Z | [] | [] |
Traceback (most recent call last):
File "/stuff/eggs/celery-3.0.13-py2.7.egg/celery/worker/consumer.py", line 392, in start
self.reset_connection()
File "/stuff/eggs/celery-3.0.13-py2.7.egg/celery/worker/consumer.py", line 741, in reset_connection
self.connection, on_decode_error=self.on_decode_error,
Fi... | 2,733 | ||||
celery/celery | celery__celery-1769 | 3c4860d2208ae07fc1f5f07d7e9ae6c79919e9c4 | diff --git a/celery/apps/worker.py b/celery/apps/worker.py
--- a/celery/apps/worker.py
+++ b/celery/apps/worker.py
@@ -315,6 +315,9 @@ def on_SIGINT(worker):
def _reload_current_worker():
+ platforms.close_open_fds([
+ sys.__stdin__, sys.__stdout__, sys.__stderr__,
+ ])
os.execv(sys.executable, [... | Sending SIGHUP leaks file handles
When sending SIGHUP to the Celery master process, it leaks all of its previously open file handles when calling exec. This is a regression introduced in 3.1 by 118b300fcad4e6ffb0178fc00cf9fe26075101a5 (originally fixed in 803655b79ccb0403f47cfcd2cfa5a6ed66301cbc for #1270).
This addi... | 2014-01-03T23:56:52Z | [] | [] |
Traceback (most recent call last):
File "celery/bootsteps.py", line 155, in send_all
fun(parent, *args)
File "celery/bootsteps.py", line 377, in stop
return self.obj.stop()
File "celery/concurrency/base.py", line 119, in stop
self.on_stop()
File "celery/concurrency/prefork.py", line 140, in on_stop... | 2,734 | ||||
celery/celery | celery__celery-1834 | 59e44ae6300e5b39b3306bc2cdc76a0b85b3d418 | diff --git a/celery/concurrency/asynpool.py b/celery/concurrency/asynpool.py
--- a/celery/concurrency/asynpool.py
+++ b/celery/concurrency/asynpool.py
@@ -501,7 +501,7 @@ def verify_process_alive(proc):
if proc._is_alive() and proc in waiting_to_start:
assert proc.outqR_fd in fileno_to_out... | "assert proc.outqR_fd in hub.readers" AssertionError
```
[2014-01-13 15:06:53,047] pid=33970/MainProcess - ERROR - celery.worker - Unrecoverable error: AssertionError()
Traceback (most recent call last):
File "/home/ionel/projects/core/.ve/local/lib/python2.7/site-packages/celery/worker/__init__.py", line 206, in ... | Tested with:
- celery/kombu@ffa90945bf06ba8b9269b4a36019baad0ac57793
- celery/billiard@c29c4f7adbd0f7f4544c05fb9777800616e89d2f
- celery/celery@ceaf7aba36eae78af852eb5ca703c81091b52f23
[I too](https://github.com/celery/kombu/issues/305) am getting a similar issue. Which transport are you using?
It was redis. I will p... | 2014-01-30T13:06:46Z | [] | [] |
Traceback (most recent call last):
File "/home/ionel/projects/core/.ve/local/lib/python2.7/site-packages/celery/worker/__init__.py", line 206, in start
self.blueprint.start(self)
File "/home/ionel/projects/core/.ve/local/lib/python2.7/site-packages/celery/bootsteps.py", line 123, in start
step.start(parent... | 2,735 |