@rafaelcascalho
Last active September 13, 2019 21:05
I got these errors after setting up all of the datasets except the MEPS ones. Are these failures really related to that?
collected 37 items
tests/test_classification_metric.py .... [ 10%]
tests/test_demo_adversarial_debiasing.py F [ 13%]
tests/test_demo_calibrated_eqodds_postprocessing.py F [ 16%]
tests/test_demo_lfr.py F [ 18%]
tests/test_demo_optim_data_preproc.py F [ 21%]
tests/test_demo_reject_option_classification.py F [ 24%]
tests/test_demo_reweighing_preproc.py F [ 27%]
tests/test_disparate_impact_remover.py FF [ 32%]
tests/test_lfr.py ....... [ 51%]
tests/test_meta_classifier.py . [ 54%]
tests/test_reweighing.py . [ 56%]
tests/test_sample_distortion_metric.py ...... [ 72%]
tests/test_standard_datasets.py ..... [ 86%]
tests/test_structured_dataset.py ..... [100%]
=========================================================================== FAILURES ===========================================================================
_______________________________________________________________ test_demo_adversarial_debiasing ________________________________________________________________
def test_demo_adversarial_debiasing():
nb, errors = notebook_run(os.path.join(
os.path.dirname(os.path.abspath(__file__)),
> '..', 'examples', 'demo_adversarial_debiasing.ipynb'))
../tests/test_demo_adversarial_debiasing.py:11:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../tests/notebook_runner.py:33: in notebook_run
nb = nbformat.read(fout, nbformat.current_nbformat)
../../../../miniconda3/envs/aiflearn/lib/python3.5/site-packages/nbformat/__init__.py:141: in read
return reads(fp.read(), as_version, **kwargs)
../../../../miniconda3/envs/aiflearn/lib/python3.5/site-packages/nbformat/__init__.py:74: in reads
nb = reader.reads(s, **kwargs)
../../../../miniconda3/envs/aiflearn/lib/python3.5/site-packages/nbformat/reader.py:58: in reads
nb_dict = parse_json(s, **kwargs)
../../../../miniconda3/envs/aiflearn/lib/python3.5/site-packages/nbformat/reader.py:14: in parse_json
nb_dict = json.loads(s, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
s = b'{\n "cells": [\n {\n "cell_type": "markdown",\n "metadata": {},\n "source": [\n "#### This notebook demons...ython",\n "pygments_lexer": "ipython3",\n "version": "3.5.6"\n }\n },\n "nbformat": 4,\n "nbformat_minor": 2\n}\n'
encoding = None, cls = None, object_hook = None, parse_float = None, parse_int = None, parse_constant = None, object_pairs_hook = None, kw = {}
def loads(s, encoding=None, cls=None, object_hook=None, parse_float=None,
parse_int=None, parse_constant=None, object_pairs_hook=None, **kw):
"""Deserialize ``s`` (a ``str`` instance containing a JSON
document) to a Python object.
``object_hook`` is an optional function that will be called with the
result of any object literal decode (a ``dict``). The return value of
``object_hook`` will be used instead of the ``dict``. This feature
can be used to implement custom decoders (e.g. JSON-RPC class hinting).
``object_pairs_hook`` is an optional function that will be called with the
result of any object literal decoded with an ordered list of pairs. The
return value of ``object_pairs_hook`` will be used instead of the ``dict``.
This feature can be used to implement custom decoders that rely on the
order that the key and value pairs are decoded (for example,
collections.OrderedDict will remember the order of insertion). If
``object_hook`` is also defined, the ``object_pairs_hook`` takes priority.
``parse_float``, if specified, will be called with the string
of every JSON float to be decoded. By default this is equivalent to
float(num_str). This can be used to use another datatype or parser
for JSON floats (e.g. decimal.Decimal).
``parse_int``, if specified, will be called with the string
of every JSON int to be decoded. By default this is equivalent to
int(num_str). This can be used to use another datatype or parser
for JSON integers (e.g. float).
``parse_constant``, if specified, will be called with one of the
following strings: -Infinity, Infinity, NaN.
This can be used to raise an exception if invalid JSON numbers
are encountered.
To use a custom ``JSONDecoder`` subclass, specify it with the ``cls``
kwarg; otherwise ``JSONDecoder`` is used.
The ``encoding`` argument is ignored and deprecated.
"""
if not isinstance(s, str):
raise TypeError('the JSON object must be str, not {!r}'.format(
> s.__class__.__name__))
E TypeError: the JSON object must be str, not 'bytes'
../../../../miniconda3/envs/aiflearn/lib/python3.5/json/__init__.py:312: TypeError
--------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------
[NbConvertApp] Converting notebook /home/codeminer42/Code/open-source/aif360-learn/tests/../examples/demo_adversarial_debiasing.ipynb to notebook
[NbConvertApp] Executing notebook with kernel: python3
[NbConvertApp] Writing 31479 bytes to /tmp/tmpnf6s8cj9.ipynb
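Aside: this TypeError looks unrelated to dataset setup. On Python 3.5, json.loads only accepts str, and nbformat.read is being handed raw bytes from the temp notebook file. A minimal sketch of the kind of fix that might belong in tests/notebook_runner.py, assuming the executed notebook is currently reopened in binary mode (read_executed_notebook and run_path are illustrative names, not the real ones):

    import nbformat

    def read_executed_notebook(run_path):
        # Opening in text mode with an explicit encoding makes fp.read()
        # return str, which json.loads accepts on Python 3.5; binary mode
        # yields bytes and raises the TypeError seen above.
        with open(run_path, encoding='utf-8') as fout:
            return nbformat.read(fout, as_version=nbformat.current_nbformat)

Equivalently, decoding the bytes with .decode('utf-8') before handing them to nbformat should also work.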
____________________________________________________________ test_calibrated_eqodds_postprocessing _____________________________________________________________
def test_calibrated_eqodds_postprocessing():
nb, errors = notebook_run(os.path.join(
os.path.dirname(os.path.abspath(__file__)),
> '..', 'examples', 'demo_calibrated_eqodds_postprocessing.ipynb'))
../tests/test_demo_calibrated_eqodds_postprocessing.py:10:
[traceback elided: identical to test_demo_adversarial_debiasing above; nbformat.read passes bytes to json.loads, which raises TypeError: the JSON object must be str, not 'bytes' (json/__init__.py:312)]
--------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------
[NbConvertApp] Converting notebook /home/codeminer42/Code/open-source/aif360-learn/tests/../examples/demo_calibrated_eqodds_postprocessing.ipynb to notebook
[NbConvertApp] Executing notebook with kernel: python3
[NbConvertApp] Writing 69824 bytes to /tmp/tmp2z2hztr7.ipynb
________________________________________________________________________ test_demo_lfr _________________________________________________________________________
def test_demo_lfr():
nb, errors = notebook_run(os.path.join(
os.path.dirname(os.path.abspath(__file__)),
> '..', 'examples', 'demo_lfr.ipynb'))
../tests/test_demo_lfr.py:11:
[traceback elided: identical to the ones above; TypeError: the JSON object must be str, not 'bytes' (json/__init__.py:312)]
--------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------
RUNNING THE L-BFGS-B CODE
* * *
Machine precision = 2.220D-16
N = 131 M = 10
At X0 0 variables are exactly at the bounds
At iterate 0 f= 2.08403D+04 |proj g|= 2.34747D+03
At iterate 1 f= 1.92268D+04 |proj g|= 2.26813D+03
At iterate 2 f= 1.87377D+04 |proj g|= 8.24798D+02
At iterate 3 f= 1.83557D+04 |proj g|= 1.06841D+03
At iterate 4 f= 1.68390D+04 |proj g|= 4.86068D+02
At iterate 5 f= 1.64426D+04 |proj g|= 1.25189D+03
At iterate 6 f= 1.63187D+04 |proj g|= 6.35896D+02
At iterate 7 f= 1.62482D+04 |proj g|= 3.71498D+02
At iterate 8 f= 1.60510D+04 |proj g|= 1.07705D+03
At iterate 9 f= 1.60067D+04 |proj g|= 9.86886D+02
At iterate 10 f= 1.59015D+04 |proj g|= 9.63617D+02
At iterate 11 f= 1.57740D+04 |proj g|= 2.59482D+02
At iterate 12 f= 1.57225D+04 |proj g|= 3.02092D+02
At iterate 13 f= 1.55402D+04 |proj g|= 9.75247D+02
At iterate 14 f= 1.54094D+04 |proj g|= 3.32286D+02
At iterate 15 f= 1.53602D+04 |proj g|= 3.75996D+02
At iterate 16 f= 1.53134D+04 |proj g|= 4.32137D+02
At iterate 17 f= 1.52420D+04 |proj g|= 3.38511D+02
At iterate 18 f= 1.51964D+04 |proj g|= 1.44210D+02
At iterate 19 f= 1.51835D+04 |proj g|= 8.54599D+01
At iterate 20 f= 1.51658D+04 |proj g|= 1.33628D+02
At iterate 21 f= 1.51340D+04 |proj g|= 2.38552D+02
At iterate 22 f= 1.51055D+04 |proj g|= 3.40586D+02
At iterate 23 f= 1.50862D+04 |proj g|= 2.09937D+02
At iterate 24 f= 1.50819D+04 |proj g|= 8.90523D+01
At iterate 25 f= 1.50802D+04 |proj g|= 1.01242D+02
At iterate 26 f= 1.50761D+04 |proj g|= 1.11889D+02
At iterate 27 f= 1.50663D+04 |proj g|= 1.50153D+02
At iterate 28 f= 1.50551D+04 |proj g|= 8.58364D+01
At iterate 29 f= 1.50504D+04 |proj g|= 9.95440D+01
At iterate 30 f= 1.50470D+04 |proj g|= 5.58267D+01
At iterate 31 f= 1.50438D+04 |proj g|= 4.11067D+01
At iterate 32 f= 1.50386D+04 |proj g|= 5.29101D+01
At iterate 33 f= 1.50296D+04 |proj g|= 6.60456D+01
At iterate 34 f= 1.50184D+04 |proj g|= 7.37458D+01
At iterate 35 f= 1.50111D+04 |proj g|= 3.21202D+02
At iterate 36 f= 1.50027D+04 |proj g|= 1.00139D+02
* * *
Tit = total number of iterations
Tnf = total number of function evaluations
Tnint = total number of segments explored during Cauchy searches
Skip = number of BFGS updates skipped
Nact = number of active bounds at final generalized Cauchy point
Projg = norm of the final projected gradient
F = final function value
* * *
N Tit Tnf Tnint Skip Nact Projg F
131 36 38 45 0 4 1.001D+02 1.500D+04
F = 15002.718090290677
STOP: TOTAL NO. of f AND g EVALUATIONS EXCEEDS LIMIT
Cauchy time 0.000E+00 seconds.
Subspace minimization time 0.000E+00 seconds.
Line search time 0.000E+00 seconds.
Total User time 0.000E+00 seconds.
--------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------
[NbConvertApp] Converting notebook /home/codeminer42/Code/open-source/aif360-learn/tests/../examples/demo_lfr.ipynb to notebook
[NbConvertApp] Executing notebook with kernel: python3
[NbConvertApp] Writing 32291 bytes to /tmp/tmp6g2pv_y_.ipynb
______________________________________________________________________ test_optim_preproc ______________________________________________________________________
def test_optim_preproc():
nb, errors = notebook_run(os.path.join(
os.path.dirname(os.path.abspath(__file__)),
> '..', 'examples', 'demo_optim_data_preproc.ipynb'))
../tests/test_demo_optim_data_preproc.py:11:
[traceback elided: identical to the ones above; TypeError: the JSON object must be str, not 'bytes' (json/__init__.py:312)]
--------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------
[NbConvertApp] Converting notebook /home/codeminer42/Code/open-source/aif360-learn/tests/../examples/demo_optim_data_preproc.ipynb to notebook
[NbConvertApp] Executing notebook with kernel: python3
[NbConvertApp] Writing 56179 bytes to /tmp/tmpuatkjc6p.ipynb
______________________________________________________________ test_reject_option_classification _______________________________________________________________
def test_reject_option_classification():
nb, errors = notebook_run(os.path.join(
os.path.dirname(os.path.abspath(__file__)),
> '..', 'examples', 'demo_reject_option_classification.ipynb'))
../tests/test_demo_reject_option_classification.py:11:
[traceback elided: identical to the ones above; TypeError: the JSON object must be str, not 'bytes' (json/__init__.py:312)]
--------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------
[NbConvertApp] Converting notebook /home/codeminer42/Code/open-source/aif360-learn/tests/../examples/demo_reject_option_classification.ipynb to notebook
[NbConvertApp] Executing notebook with kernel: python3
[NbConvertApp] Writing 44535 bytes to /tmp/tmpi92b8pqo.ipynb
___________________________________________________________________ test_reweighing_preproc ____________________________________________________________________
def test_reweighing_preproc():
nb, errors = notebook_run(os.path.join(
os.path.dirname(os.path.abspath(__file__)),
> '..', 'examples', 'demo_reweighing_preproc.ipynb'))
../tests/test_demo_reweighing_preproc.py:11:
[traceback elided: identical to the ones above; TypeError: the JSON object must be str, not 'bytes' (json/__init__.py:312)]
--------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------
[NbConvertApp] Converting notebook /home/codeminer42/Code/open-source/aif360-learn/tests/../examples/demo_reweighing_preproc.ipynb to notebook
[NbConvertApp] Executing notebook with kernel: python3
[NbConvertApp] Writing 57583 bytes to /tmp/tmp8xe484rd.ipynb
_________________________________________________________________________ test_repair0 _________________________________________________________________________
def test_repair0():
ad = AdultDataset(protected_attribute_names=['sex'],
privileged_classes=[['Male']], categorical_features=[],
features_to_keep=['age', 'education-num'])
> di = DisparateImpactRemover(repair_level=0.)
../tests/test_disparate_impact_remover.py:17:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <aiflearn.algorithms.preprocessing.disparate_impact_remover.DisparateImpactRemover object at 0x7ff158379438>, repair_level = 0.0
sensitive_attribute = None
def __init__(self, repair_level=1.0, sensitive_attribute=None):
super(DisparateImpactRemover, self).__init__(repair_level=repair_level)
# avoid importing early since this package can throw warnings in some
# jupyter notebooks
> from BlackBoxAuditing.repairers.GeneralRepairer import Repairer
E ImportError: No module named 'BlackBoxAuditing'
../aiflearn/algorithms/preprocessing/disparate_impact_remover.py:36: ImportError
__________________________________________________________________________ test_adult __________________________________________________________________________
def test_adult():
protected = 'sex'
ad = AdultDataset(protected_attribute_names=[protected],
privileged_classes=[['Male']], categorical_features=[],
features_to_keep=['age', 'education-num', 'capital-gain',
'capital-loss', 'hours-per-week'])
scaler = MinMaxScaler(copy=False)
# ad.features = scaler.fit_transform(ad.features)
train, test = ad.split([32561])
assert np.any(test.labels)
train.features = scaler.fit_transform(train.features)
test.features = scaler.transform(test.features)
index = train.feature_names.index(protected)
X_tr = np.delete(train.features, index, axis=1)
X_te = np.delete(test.features, index, axis=1)
y_tr = train.labels.ravel()
> di = DisparateImpactRemover(repair_level=1.0)
../tests/test_disparate_impact_remover.py:43:
[traceback elided: identical to test_repair0 above; DisparateImpactRemover.__init__ raises ImportError: No module named 'BlackBoxAuditing' (disparate_impact_remover.py:36)]
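The two test_disparate_impact_remover failures look like a separate, simpler problem: a missing optional dependency. Assuming the PyPI package name matches the import name, installing it should clear both:

    pip install BlackBoxAuditing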
======================================================================= warnings summary =======================================================================
tests/test_lfr.py::test_fit_isnumpy
/home/codeminer42/Code/open-source/aif360-learn/aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py:67: NumbaWarning:
Compilation is falling back to object mode WITH looplifting enabled because Function "LFR_optim_obj" failed type inference due to: Unknown attribute 'iters' of type recursive(type(CPUDispatcher(<function LFR_optim_obj at 0x7ff133bdc510>)))
File "../aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py", line 71:
def LFR_optim_obj(params, data_sensitive, data_nonsensitive, y_sensitive,
<source elided>
LFR_optim_obj.iters += 1
^
[1] During: typing of get attribute at /home/codeminer42/Code/open-source/aif360-learn/aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py (71)
File "../aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py", line 71:
def LFR_optim_obj(params, data_sensitive, data_nonsensitive, y_sensitive,
<source elided>
LFR_optim_obj.iters += 1
^
@jit
tests/test_lfr.py::test_fit_isnumpy
/home/codeminer42/Code/open-source/aif360-learn/aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py:67: NumbaWarning:
Compilation is falling back to object mode WITHOUT looplifting enabled because Function "LFR_optim_obj" failed type inference due to: cannot determine Numba type of <class 'numba.dispatcher.LiftedLoop'>
File "../aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py", line 90:
def LFR_optim_obj(params, data_sensitive, data_nonsensitive, y_sensitive,
<source elided>
L_z = 0.0
for j in range(k):
^
@jit
tests/test_lfr.py::test_fit_isnumpy
/home/codeminer42/miniconda3/envs/aiflearn/lib/python3.5/site-packages/numba/compiler.py:742: NumbaWarning: Function "LFR_optim_obj" was compiled in object mode without forceobj=True, but has lifted loops.
File "../aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py", line 68:
@jit
def LFR_optim_obj(params, data_sensitive, data_nonsensitive, y_sensitive,
^
self.func_ir.loc))
tests/test_lfr.py::test_fit_isnumpy
/home/codeminer42/miniconda3/envs/aiflearn/lib/python3.5/site-packages/numba/compiler.py:751: NumbaDeprecationWarning:
Fall-back from the nopython compilation path to the object mode compilation path has been detected, this is deprecated behaviour.
For more information visit http://numba.pydata.org/numba-doc/latest/reference/deprecation.html#deprecation-of-object-mode-fall-back-behaviour-when-using-jit
File "../aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py", line 68:
@jit
def LFR_optim_obj(params, data_sensitive, data_nonsensitive, y_sensitive,
^
warnings.warn(errors.NumbaDeprecationWarning(msg, self.func_ir.loc))
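Aside on the LFR_optim_obj warnings above: they happen because the function mutates an attribute on its own dispatcher (LFR_optim_obj.iters += 1), which Numba's nopython mode cannot type, so compilation falls back to object mode. A hedged sketch of the usual workaround, not the aiflearn code: thread the counter through as an argument and return it.

    import numpy as np
    from numba import njit

    @njit
    def optim_obj(params, iters):
        # Stand-in objective; the real one lives in lfr_helpers/helpers.py.
        value = np.sum(params ** 2)
        # Returning the updated counter keeps all state in typed arguments,
        # so nopython mode never has to type a function attribute.
        return value, iters + 1

    value, iters = optim_obj(np.ones(4), 0)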
tests/test_lfr.py::test_transform_protecteddataset
/home/codeminer42/Code/open-source/aif360-learn/aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py:55: NumbaWarning:
Compilation is falling back to object mode WITH looplifting enabled because Function "yhat" failed type inference due to: Invalid use of Function(<built-in function iadd>) with argument(s) of type(s): (float64, array(float64, 1d, C))
Known signatures:
* (int64, int64) -> int64
* (int64, uint64) -> int64
* (uint64, int64) -> int64
* (uint64, uint64) -> uint64
* (float32, float32) -> float32
* (float64, float64) -> float64
* (complex64, complex64) -> complex64
* (complex128, complex128) -> complex128
* parameterized
In definition 0:
All templates rejected with literals.
In definition 1:
All templates rejected without literals.
In definition 2:
All templates rejected with literals.
In definition 3:
All templates rejected without literals.
In definition 4:
All templates rejected with literals.
In definition 5:
All templates rejected without literals.
In definition 6:
All templates rejected with literals.
In definition 7:
All templates rejected without literals.
In definition 8:
All templates rejected with literals.
In definition 9:
All templates rejected without literals.
In definition 10:
All templates rejected with literals.
In definition 11:
All templates rejected without literals.
In definition 12:
All templates rejected with literals.
In definition 13:
All templates rejected without literals.
This error is usually caused by passing an argument of a type that is unsupported by the named function.
[1] During: typing of intrinsic-call at /home/codeminer42/Code/open-source/aif360-learn/aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py (64)
File "../aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py", line 64:
def yhat(M_nk, y, w, N, k):
<source elided>
yhat[i] = 0.999 if yhat[i] >= 1 else yhat[i]
L_y += -1 * y[i] * np.log(yhat[i]) - (1.0 - y[i]) * np.log(1.0 - yhat[i])
^
@jit
tests/test_lfr.py::test_transform_protecteddataset
/home/codeminer42/Code/open-source/aif360-learn/aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py:55: NumbaWarning:
Compilation is falling back to object mode WITHOUT looplifting enabled because Function "yhat" failed type inference due to: cannot determine Numba type of <class 'numba.dispatcher.LiftedLoop'>
File "../aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py", line 59:
def yhat(M_nk, y, w, N, k):
<source elided>
L_y = 0.0
for i in range(N):
^
@jit
tests/test_lfr.py::test_transform_protecteddataset
/home/codeminer42/miniconda3/envs/aiflearn/lib/python3.5/site-packages/numba/compiler.py:742: NumbaWarning: Function "yhat" was compiled in object mode without forceobj=True, but has lifted loops.
File "../aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py", line 56:
@jit
def yhat(M_nk, y, w, N, k):
^
self.func_ir.loc))
tests/test_lfr.py::test_transform_protecteddataset
/home/codeminer42/miniconda3/envs/aiflearn/lib/python3.5/site-packages/numba/compiler.py:751: NumbaDeprecationWarning:
Fall-back from the nopython compilation path to the object mode compilation path has been detected, this is deprecated behaviour.
For more information visit http://numba.pydata.org/numba-doc/latest/reference/deprecation.html#deprecation-of-object-mode-fall-back-behaviour-when-using-jit
File "../aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py", line 56:
@jit
def yhat(M_nk, y, w, N, k):
^
warnings.warn(errors.NumbaDeprecationWarning(msg, self.func_ir.loc))
tests/test_lfr.py::test_transform_protecteddataset
/home/codeminer42/Code/open-source/aif360-learn/aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py:55: NumbaWarning:
Compilation is falling back to object mode WITHOUT looplifting enabled because Function "yhat" failed type inference due to: Invalid use of Function(<built-in function iadd>) with argument(s) of type(s): (float64, array(float64, 1d, C))
[signature list and source context elided: identical to the first yhat warning above]
@jit
tests/test_lfr.py::test_transform_protecteddataset
/home/codeminer42/miniconda3/envs/aiflearn/lib/python3.5/site-packages/numba/compiler.py:742: NumbaWarning: Function "yhat" was compiled in object mode without forceobj=True.
File "../aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py", line 59:
def yhat(M_nk, y, w, N, k):
<source elided>
L_y = 0.0
for i in range(N):
^
self.func_ir.loc))
tests/test_lfr.py::test_transform_protecteddataset
/home/codeminer42/miniconda3/envs/aiflearn/lib/python3.5/site-packages/numba/compiler.py:751: NumbaDeprecationWarning:
Fall-back from the nopython compilation path to the object mode compilation path has been detected, this is deprecated behaviour.
For more information visit http://numba.pydata.org/numba-doc/latest/reference/deprecation.html#deprecation-of-object-mode-fall-back-behaviour-when-using-jit
File "../aiflearn/algorithms/preprocessing/lfr_helpers/helpers.py", line 59:
def yhat(M_nk, y, w, N, k):
<source elided>
L_y = 0.0
for i in range(N):
^
warnings.warn(errors.NumbaDeprecationWarning(msg, self.func_ir.loc))
-- Docs: https://docs.pytest.org/en/latest/warnings.html
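The yhat warnings are a different typing failure: L_y, a float64, is incremented by a 1-d array, which usually means y (or yhat) arrived with shape (N, 1), so that y[i] is itself an array. That is a guess from the message, but if so, flattening the inputs before the loop would let nopython mode compile the accumulation. A sketch under that assumption, not the aiflearn code:

    import numpy as np
    from numba import njit

    @njit
    def log_loss(y, yhat):
        L_y = 0.0
        for i in range(y.shape[0]):
            p = min(max(yhat[i], 1e-6), 0.999)  # clamp as the original does
            L_y += -y[i] * np.log(p) - (1.0 - y[i]) * np.log(1.0 - p)
        return L_y

    y = np.array([[1.0], [0.0], [1.0]]).ravel()  # (N, 1) -> (N,) keeps y[i] scalar
    print(log_loss(y, np.array([0.9, 0.2, 0.7])))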
==================================================== 8 failed, 29 passed, 11 warnings in 906.57s (0:15:06) =====================================================
RUNNING THE L-BFGS-B CODE
* * *
Machine precision = 2.220D-16
N = 131 M = 10
At X0 0 variables are exactly at the bounds
At iterate 0 f= 2.26526D+04 |proj g|= 1.58260D+03
At iterate 1 f= 2.04189D+04 |proj g|= 4.91265D+02
At iterate 2 f= 2.01309D+04 |proj g|= 4.96682D+02
At iterate 3 f= 1.97260D+04 |proj g|= 4.65151D+02
At iterate 4 f= 1.90393D+04 |proj g|= 8.07770D+02
At iterate 5 f= 1.88301D+04 |proj g|= 7.56099D+02
At iterate 6 f= 1.87428D+04 |proj g|= 8.94691D+02
At iterate 7 f= 1.83953D+04 |proj g|= 8.81409D+02
At iterate 8 f= 1.75778D+04 |proj g|= 1.46632D+03
At iterate 9 f= 1.73528D+04 |proj g|= 8.93699D+02
At iterate 10 f= 1.72061D+04 |proj g|= 9.64026D+02
At iterate 11 f= 1.68696D+04 |proj g|= 9.75129D+02
At iterate 12 f= 1.66957D+04 |proj g|= 6.94447D+02
At iterate 13 f= 1.64707D+04 |proj g|= 8.57689D+02
At iterate 14 f= 1.61611D+04 |proj g|= 3.70558D+02
At iterate 15 f= 1.60457D+04 |proj g|= 5.00958D+02
At iterate 16 f= 1.59239D+04 |proj g|= 4.47909D+02
At iterate 17 f= 1.58222D+04 |proj g|= 9.56907D+02
At iterate 18 f= 1.57106D+04 |proj g|= 6.13336D+02
At iterate 19 f= 1.56562D+04 |proj g|= 3.55303D+02
At iterate 20 f= 1.55893D+04 |proj g|= 3.50948D+02
At iterate 21 f= 1.55119D+04 |proj g|= 2.49244D+02
At iterate 22 f= 1.54403D+04 |proj g|= 2.58441D+02
At iterate 23 f= 1.53811D+04 |proj g|= 2.76940D+02
At iterate 24 f= 1.53437D+04 |proj g|= 5.52288D+02
At iterate 25 f= 1.52968D+04 |proj g|= 2.03731D+02
At iterate 26 f= 1.52443D+04 |proj g|= 2.27741D+02
At iterate 27 f= 1.52216D+04 |proj g|= 2.62413D+02
At iterate 28 f= 1.51943D+04 |proj g|= 2.26614D+02
At iterate 29 f= 1.51479D+04 |proj g|= 1.08968D+02
At iterate 30 f= 1.51290D+04 |proj g|= 1.41082D+02
At iterate 31 f= 1.50983D+04 |proj g|= 1.44776D+02
At iterate 32 f= 1.50906D+04 |proj g|= 1.54591D+02
* * *
Tit = total number of iterations
Tnf = total number of function evaluations
Tnint = total number of segments explored during Cauchy searches
Skip = number of BFGS updates skipped
Nact = number of active bounds at final generalized Cauchy point
Projg = norm of the final projected gradient
F = final function value
* * *
N Tit Tnf Tnint Skip Nact Projg F
131 32 38 41 0 3 1.546D+02 1.509D+04
F = 15090.552840313529
STOP: TOTAL NO. of f AND g EVALUATIONS EXCEEDS LIMIT
Cauchy time 0.000E+00 seconds.
Subspace minimization time 0.000E+00 seconds.
Line search time 0.000E+00 seconds.
Total User time 0.000E+00 seconds.
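One last aside: the "STOP: TOTAL NO. of f AND g EVALUATIONS EXCEEDS LIMIT" lines in the L-BFGS-B traces mean the optimizer exhausted its evaluation budget rather than converging; it is a warning, not a test failure. If it ever needs tuning, scipy exposes the cap. A sketch with a stand-in objective, assuming the demo goes through scipy.optimize.fmin_l_bfgs_b (which prints traces like these):

    import numpy as np
    from scipy.optimize import fmin_l_bfgs_b

    def objective(x):
        return float(np.sum(x ** 2))  # stand-in objective

    x0 = np.ones(131)  # N = 131, matching the traces above
    # maxfun bounds the number of f and g evaluations; the default is 15000.
    x_opt, f_opt, info = fmin_l_bfgs_b(objective, x0, approx_grad=True,
                                       maxfun=15000)
    print(info['task'])  # reports why the optimizer stopped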