@bsipocz
Created November 2, 2021 21:43
astroquery.simbad remote failures 20211102
$ pt -P simbad -R
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/setuptools/distutils_patch.py:25: UserWarning: Distutils was imported before Setuptools. This usage is discouraged and may exhibit undesirable behaviors or errors. Please use Setuptools' objects directly or at least import Setuptools first.
warnings.warn(
Freezing version number to ./astroquery/version.py
running test
INFO: installing to temporary directory: /private/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/astroquery-test-hzkpzids [astropy.tests.command]
no previously-included directories found matching 'build'
no previously-included directories found matching 'docs/_build'
no previously-included directories found matching 'docs/api'
no previously-included directories found matching 'astropy_helpers/build'
warning: no previously-included files matching '*.o' found anywhere in distribution
WARNING: AstropyDeprecationWarning: The remote_data option should be one of none/astropy/any (found =any). For backward-compatibility, assuming 'any', but you should change the option to be one of the supported ones to avoid issues in future. [astropy.tests.runner]
================================================= test session starts ==================================================
platform darwin -- Python 3.9.1, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
Running tests with astroquery version 0.4.4.dev7065_testrun.
Running tests in astroquery/simbad docs/simbad.
Date: 2021-11-02T14:39:23
Platform: macOS-10.13.6-x86_64-i386-64bit
Executable: /Users/bsipocz/.pyenv/versions/3.9.1/bin/python
Full Python Version:
3.9.1 (default, Jan 11 2021, 15:50:22)
[Clang 10.0.0 (clang-1000.11.45.5)]
encodings: sys: utf-8, locale: UTF-8, filesystem: utf-8
byteorder: little
float info: dig: 15, mant_dig: 15
Package versions:
Numpy: 1.19.5
Matplotlib: 3.3.3
Astropy: 4.3.1
APLpy: not available
pyregion: not available
regions: 0.4
pyVO: 1.1
mocpy: 0.8.5
astropy-healpix: 0.6
vamdclib: not available
astropy-helpers: 4.0.1
Using Astropy options: remote_data: any.
rootdir: /private/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/astroquery-test-hzkpzids/lib/python3.9/site-packages, configfile: setup.cfg
plugins: doctestplus-0.11.0, cov-3.0.0, arraydiff-0.3, remotedata-0.3.2, hypothesis-6.3.1, filter-subpackage-0.1.1, openfiles-0.5.0, astropy-header-0.1.2
collected 90 items
astroquery/simbad/core.py .... [ 4%]
astroquery/simbad/tests/test_simbad.py ......................................................... [ 67%]
astroquery/simbad/tests/test_simbad_remote.py FFFFFFFFFFFFFFFFFFFFFFFFFF.. [ 98%]
docs/simbad/simbad.rst s [100%]
======================================================= FAILURES =======================================================
___________________________________________ TestSimbad.test_query_criteria1 ____________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f5d3190>
def test_query_criteria1(self):
> result = simbad.core.Simbad.query_criteria(
"region(box, GAL, 49.89 -0.3, 0.5d 0.5d)", otype='HII')
astroquery/simbad/tests/test_simbad_remote.py:25:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:489: in query_criteria
result = self.query_criteria_async(*args, **kwargs)
astroquery/simbad/core.py:516: in query_criteria_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/7523943f40983b5d6c4e0651f5e4943ed09fce627dada45b9fda2979.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
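
The AttributeError above comes out of astroquery's response cache: to_cache() pickles the Response object, and the message indicates the object being cached still references last_query, a function defined inside post_mockreturn (the mock used by the non-remote tests), which pickle cannot serialize. A minimal, self-contained sketch of that pickle limitation (the names mirror the log; the code is illustrative, not astroquery's):

import pickle

def post_mockreturn():
    # A function defined inside another function is a "local object";
    # pickle can only serialize callables importable as module.qualname.
    def last_query():
        return "last query text"
    return last_query

try:
    pickle.dumps(post_mockreturn())
except AttributeError as exc:
    print(exc)  # Can't pickle local object 'post_mockreturn.<locals>.last_query'
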
___________________________________________ TestSimbad.test_query_criteria2 ____________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f580790>
def test_query_criteria2(self):
> result = simbad.core.Simbad.query_criteria(otype='SNR')
astroquery/simbad/tests/test_simbad_remote.py:30:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:489: in query_criteria
result = self.query_criteria_async(*args, **kwargs)
astroquery/simbad/core.py:516: in query_criteria_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/30c92b5ac737299a5dd0c393503e5a473c25157f86be8da260aa6ca3.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
_________________________________________ TestSimbad.test_query_bibcode_async __________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f6619a0>
def test_query_bibcode_async(self):
> response = simbad.core.Simbad.query_bibcode_async(
'2006ApJ*', wildcard=True)
astroquery/simbad/tests/test_simbad_remote.py:34:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:898: in query_bibcode_async
response = self._request("POST", self.SIMBAD_URL, cache=cache,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/ee90b7d28004ea512833b28404e611af87693174102a2525f468016e.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
____________________________________________ TestSimbad.test_query_bibcode _____________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f528850>
def test_query_bibcode(self):
> result = simbad.core.Simbad.query_bibcode('2006ApJ*', wildcard=True)
astroquery/simbad/tests/test_simbad_remote.py:47:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:858: in query_bibcode
response = self.query_bibcode_async(bibcode, wildcard=wildcard,
astroquery/simbad/core.py:898: in query_bibcode_async
response = self._request("POST", self.SIMBAD_URL, cache=cache,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:311: in _request
response = query.from_cache(self.cache_location)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <astroquery.query.AstroQuery object at 0x14f5285e0>
cache_location = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad'
def from_cache(self, cache_location):
request_file = self.request_file(cache_location)
try:
with open(request_file, "rb") as f:
> response = pickle.load(f)
E EOFError: Ran out of input
astroquery/query.py:109: EOFError
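
The EOFError here looks like fallout from the caching failures above: to_cache() opens the cache file with open(cache_file, "wb") before pickle.dump() raises, so a zero-byte pickle file is left on disk; the next call for the same query finds that file and pickle.load() on an empty stream raises "Ran out of input". A minimal sketch of that sequence (the temporary file path is illustrative):

import pickle
import tempfile

# Simulate the empty cache file left behind when open(..., "wb") succeeds
# but pickle.dump() fails before writing any bytes.
with tempfile.NamedTemporaryFile(suffix=".pickle", delete=False) as f:
    cache_file = f.name  # created, zero bytes written

with open(cache_file, "rb") as f:
    try:
        pickle.load(f)
    except EOFError as exc:
        print(exc)  # Ran out of input
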
__________________________________________ TestSimbad.test_query_bibobj_async __________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f5d3790>
def test_query_bibobj_async(self):
> response = simbad.core.Simbad.query_bibobj_async('2005A&A.430.165F')
astroquery/simbad/tests/test_simbad_remote.py:51:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:830: in query_bibobj_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/32234dbcaf90718dca0e04b2bdaba248fa098eae7537e4308994067d.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
_____________________________________________ TestSimbad.test_query_bibobj _____________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f11e640>
def test_query_bibobj(self):
> result = simbad.core.Simbad.query_bibobj('2005A&A.430.165F')
astroquery/simbad/tests/test_simbad_remote.py:55:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:797: in query_bibobj
response = self.query_bibobj_async(bibcode,
astroquery/simbad/core.py:830: in query_bibobj_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:311: in _request
response = query.from_cache(self.cache_location)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <astroquery.query.AstroQuery object at 0x14f11eac0>
cache_location = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad'
def from_cache(self, cache_location):
request_file = self.request_file(cache_location)
try:
with open(request_file, "rb") as f:
> response = pickle.load(f)
E EOFError: Ran out of input
astroquery/query.py:109: EOFError
_________________________________________ TestSimbad.test_query_catalog_async __________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f673490>
def test_query_catalog_async(self):
> response = simbad.core.Simbad.query_catalog_async('m')
astroquery/simbad/tests/test_simbad_remote.py:59:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:775: in query_catalog_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/7397a8faf035276c70e1a427a44698424cc6cac60a97faf261b1affa.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
____________________________________________ TestSimbad.test_query_catalog _____________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f58b4c0>
def test_query_catalog(self):
> result = simbad.core.Simbad.query_catalog('m')
astroquery/simbad/tests/test_simbad_remote.py:63:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:743: in query_catalog
response = self.query_catalog_async(catalog, cache=cache,
astroquery/simbad/core.py:775: in query_catalog_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:311: in _request
response = query.from_cache(self.cache_location)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <astroquery.query.AstroQuery object at 0x14f58bbe0>
cache_location = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad'
def from_cache(self, cache_location):
request_file = self.request_file(cache_location)
try:
with open(request_file, "rb") as f:
> response = pickle.load(f)
E EOFError: Ran out of input
astroquery/query.py:109: EOFError
__________________________________________ TestSimbad.test_query_region_async __________________________________________
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>
conn = <urllib3.connection.HTTPConnection object at 0x14f5dbee0>, method = 'POST', url = '/simbad/sim-script'
timeout = Timeout(connect=60, read=60, total=None), chunked = False
httplib_request_kw = {'body': 'script=votable+%7Bmain_id%2Ccoordinates%7D%0Avotable+open%0Aquery+coo+5%3A35%3A17.3+-80%3A52%3A00+radius%3D5...D=E196137FBC488FCE1627742B2CE3FE63.new', 'Content-Length': '154', 'Content-Type': 'application/x-www-form-urlencoded'}}
timeout_obj = Timeout(connect=60, read=60, total=None), read_timeout = 60
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
self._validate_conn(conn)
except (SocketTimeout, BaseSSLError) as e:
# Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
raise
# conn.request() calls http.client.*.request, not the method in
# urllib3.request. It also calls makefile (recv) on the socket.
try:
if chunked:
conn.request_chunked(method, url, **httplib_request_kw)
else:
conn.request(method, url, **httplib_request_kw)
# We are swallowing BrokenPipeError (errno.EPIPE) since the server is
# legitimately able to close the connection after sending a valid response.
# With this behaviour, the received response is still readable.
except BrokenPipeError:
# Python 3
pass
except IOError as e:
# Python 2 and macOS/Linux
# EPIPE and ESHUTDOWN are BrokenPipeError on Python 2, and EPROTOTYPE is needed on macOS
# https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
if e.errno not in {
errno.EPIPE,
errno.ESHUTDOWN,
errno.EPROTOTYPE,
}:
raise
# Reset the timeout for the recv() on the socket
read_timeout = timeout_obj.read_timeout
# App Engine doesn't have a sock attr
if getattr(conn, "sock", None):
# In Python 3 socket.py will catch EAGAIN and return None when you
# try and read into the file pointer created by http.client, which
# instead raises a BadStatusLine exception. Instead of catching
# the exception and assuming all BadStatusLine exceptions are read
# timeouts, check for a zero timeout before making the request.
if read_timeout == 0:
raise ReadTimeoutError(
self, url, "Read timed out. (read timeout=%s)" % read_timeout
)
if read_timeout is Timeout.DEFAULT_TIMEOUT:
conn.sock.settimeout(socket.getdefaulttimeout())
else: # None or a value
conn.sock.settimeout(read_timeout)
# Receive the response from the server
try:
try:
# Python 2.7, use buffering of HTTP responses
httplib_response = conn.getresponse(buffering=True)
except TypeError:
# Python 3
try:
httplib_response = conn.getresponse()
except BaseException as e:
# Remove the TypeError from the exception chain in
# Python 3 (including for exceptions like SystemExit).
# Otherwise it looks like a bug in the code.
> six.raise_from(e, None)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/connectionpool.py:445:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
value = None, from_value = None
> ???
<string>:3:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>
conn = <urllib3.connection.HTTPConnection object at 0x14f5dbee0>, method = 'POST', url = '/simbad/sim-script'
timeout = Timeout(connect=60, read=60, total=None), chunked = False
httplib_request_kw = {'body': 'script=votable+%7Bmain_id%2Ccoordinates%7D%0Avotable+open%0Aquery+coo+5%3A35%3A17.3+-80%3A52%3A00+radius%3D5...D=E196137FBC488FCE1627742B2CE3FE63.new', 'Content-Length': '154', 'Content-Type': 'application/x-www-form-urlencoded'}}
timeout_obj = Timeout(connect=60, read=60, total=None), read_timeout = 60
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
self._validate_conn(conn)
except (SocketTimeout, BaseSSLError) as e:
# Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
raise
# conn.request() calls http.client.*.request, not the method in
# urllib3.request. It also calls makefile (recv) on the socket.
try:
if chunked:
conn.request_chunked(method, url, **httplib_request_kw)
else:
conn.request(method, url, **httplib_request_kw)
# We are swallowing BrokenPipeError (errno.EPIPE) since the server is
# legitimately able to close the connection after sending a valid response.
# With this behaviour, the received response is still readable.
except BrokenPipeError:
# Python 3
pass
except IOError as e:
# Python 2 and macOS/Linux
# EPIPE and ESHUTDOWN are BrokenPipeError on Python 2, and EPROTOTYPE is needed on macOS
# https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
if e.errno not in {
errno.EPIPE,
errno.ESHUTDOWN,
errno.EPROTOTYPE,
}:
raise
# Reset the timeout for the recv() on the socket
read_timeout = timeout_obj.read_timeout
# App Engine doesn't have a sock attr
if getattr(conn, "sock", None):
# In Python 3 socket.py will catch EAGAIN and return None when you
# try and read into the file pointer created by http.client, which
# instead raises a BadStatusLine exception. Instead of catching
# the exception and assuming all BadStatusLine exceptions are read
# timeouts, check for a zero timeout before making the request.
if read_timeout == 0:
raise ReadTimeoutError(
self, url, "Read timed out. (read timeout=%s)" % read_timeout
)
if read_timeout is Timeout.DEFAULT_TIMEOUT:
conn.sock.settimeout(socket.getdefaulttimeout())
else: # None or a value
conn.sock.settimeout(read_timeout)
# Receive the response from the server
try:
try:
# Python 2.7, use buffering of HTTP responses
httplib_response = conn.getresponse(buffering=True)
except TypeError:
# Python 3
try:
> httplib_response = conn.getresponse()
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/connectionpool.py:440:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPConnection object at 0x14f5dbee0>
def getresponse(self):
"""Get the response from the server.
If the HTTPConnection is in the correct state, returns an
instance of HTTPResponse or of whatever object is returned by
the response_class variable.
If a request has not been sent or if a previous response has
not be handled, ResponseNotReady is raised. If the HTTP
response indicates that the connection should be closed, then
it will be closed before the response is returned. When the
connection is closed, the underlying socket is closed.
"""
# if a prior response has been completed, then forget about it.
if self.__response and self.__response.isclosed():
self.__response = None
# if a prior response exists, then it must be completed (otherwise, we
# cannot read this response's header to determine the connection-close
# behavior)
#
# note: if a prior response existed, but was connection-close, then the
# socket and response were made independent of this HTTPConnection
# object since a new request requires that we open a whole new
# connection
#
# this means the prior response had one of two states:
# 1) will_close: this connection was reset and the prior socket and
# response operate independently
# 2) persistent: the response was retained and we await its
# isclosed() status to become true.
#
if self.__state != _CS_REQ_SENT or self.__response:
raise ResponseNotReady(self.__state)
if self.debuglevel > 0:
response = self.response_class(self.sock, self.debuglevel,
method=self._method)
else:
response = self.response_class(self.sock, method=self._method)
try:
try:
> response.begin()
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/http/client.py:1347:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <http.client.HTTPResponse object at 0x14f2e0c40>
def begin(self):
if self.headers is not None:
# we've already started reading the response
return
# read until we get a non-100 response
while True:
> version, status, reason = self._read_status()
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/http/client.py:307:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <http.client.HTTPResponse object at 0x14f2e0c40>
def _read_status(self):
> line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/http/client.py:268:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <socket.SocketIO object at 0x14f2e05b0>, b = <memory at 0x14c0dd040>
def readinto(self, b):
"""Read up to len(b) bytes into the writable buffer *b* and return
the number of bytes read. If the socket is non-blocking and no bytes
are available, None is returned.
If *b* is non-empty, a 0 return value indicates that the connection
was shutdown at the other end.
"""
self._checkClosed()
self._checkReadable()
if self._timeout_occurred:
raise OSError("cannot read from timed out object")
while True:
try:
> return self._sock.recv_into(b)
E socket.timeout: timed out
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/socket.py:704: timeout
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x14f0d1af0>, request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=60, read=60, total=None), verify = True, cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>, method = 'POST', url = '/simbad/sim-script'
body = 'script=votable+%7Bmain_id%2Ccoordinates%7D%0Avotable+open%0Aquery+coo+5%3A35%3A17.3+-80%3A52%3A00+radius%3D5.0d+frame%3DICRS+equi%3D2000.0%0Avotable+close'
headers = {'User-Agent': 'astroquery/0.4.4.dev7065_testrun python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept'...ID=E196137FBC488FCE1627742B2CE3FE63.new', 'Content-Length': '154', 'Content-Type': 'application/x-www-form-urlencoded'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None), redirect = False
assert_same_host = False, timeout = Timeout(connect=60, read=60, total=None), pool_timeout = None, release_conn = False
chunked = False, body_pos = None, response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/simbad/sim-script', query=None, fragment=None)
destination_scheme = None, conn = None, release_this_conn = True, http_tunnel_required = False, err = None
clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None), method = 'POST', url = '/simbad/sim-script'
response = None
error = ReadTimeoutError("HTTPConnectionPool(host='simbad.u-strasbg.fr', port=80): Read timed out. (read timeout=60)")
_pool = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>
_stacktrace = <traceback object at 0x14f5a93c0>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
> raise six.reraise(type(error), error, _stacktrace)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/util/retry.py:531:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tp = <class 'urllib3.exceptions.ReadTimeoutError'>, value = None, tb = None
def reraise(tp, value, tb=None):
try:
if value is None:
value = tp()
if value.__traceback__ is not tb:
raise value.with_traceback(tb)
> raise value
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/packages/six.py:735:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>, method = 'POST', url = '/simbad/sim-script'
body = 'script=votable+%7Bmain_id%2Ccoordinates%7D%0Avotable+open%0Aquery+coo+5%3A35%3A17.3+-80%3A52%3A00+radius%3D5.0d+frame%3DICRS+equi%3D2000.0%0Avotable+close'
headers = {'User-Agent': 'astroquery/0.4.4.dev7065_testrun python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept'...ID=E196137FBC488FCE1627742B2CE3FE63.new', 'Content-Length': '154', 'Content-Type': 'application/x-www-form-urlencoded'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None), redirect = False
assert_same_host = False, timeout = Timeout(connect=60, read=60, total=None), pool_timeout = None, release_conn = False
chunked = False, body_pos = None, response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/simbad/sim-script', query=None, fragment=None)
destination_scheme = None, conn = None, release_this_conn = True, http_tunnel_required = False, err = None
clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>
conn = <urllib3.connection.HTTPConnection object at 0x14f5dbee0>, method = 'POST', url = '/simbad/sim-script'
timeout = Timeout(connect=60, read=60, total=None), chunked = False
httplib_request_kw = {'body': 'script=votable+%7Bmain_id%2Ccoordinates%7D%0Avotable+open%0Aquery+coo+5%3A35%3A17.3+-80%3A52%3A00+radius%3D5...D=E196137FBC488FCE1627742B2CE3FE63.new', 'Content-Length': '154', 'Content-Type': 'application/x-www-form-urlencoded'}}
timeout_obj = Timeout(connect=60, read=60, total=None), read_timeout = 60
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
self._validate_conn(conn)
except (SocketTimeout, BaseSSLError) as e:
# Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
raise
# conn.request() calls http.client.*.request, not the method in
# urllib3.request. It also calls makefile (recv) on the socket.
try:
if chunked:
conn.request_chunked(method, url, **httplib_request_kw)
else:
conn.request(method, url, **httplib_request_kw)
# We are swallowing BrokenPipeError (errno.EPIPE) since the server is
# legitimately able to close the connection after sending a valid response.
# With this behaviour, the received response is still readable.
except BrokenPipeError:
# Python 3
pass
except IOError as e:
# Python 2 and macOS/Linux
# EPIPE and ESHUTDOWN are BrokenPipeError on Python 2, and EPROTOTYPE is needed on macOS
# https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
if e.errno not in {
errno.EPIPE,
errno.ESHUTDOWN,
errno.EPROTOTYPE,
}:
raise
# Reset the timeout for the recv() on the socket
read_timeout = timeout_obj.read_timeout
# App Engine doesn't have a sock attr
if getattr(conn, "sock", None):
# In Python 3 socket.py will catch EAGAIN and return None when you
# try and read into the file pointer created by http.client, which
# instead raises a BadStatusLine exception. Instead of catching
# the exception and assuming all BadStatusLine exceptions are read
# timeouts, check for a zero timeout before making the request.
if read_timeout == 0:
raise ReadTimeoutError(
self, url, "Read timed out. (read timeout=%s)" % read_timeout
)
if read_timeout is Timeout.DEFAULT_TIMEOUT:
conn.sock.settimeout(socket.getdefaulttimeout())
else: # None or a value
conn.sock.settimeout(read_timeout)
# Receive the response from the server
try:
try:
# Python 2.7, use buffering of HTTP responses
httplib_response = conn.getresponse(buffering=True)
except TypeError:
# Python 3
try:
httplib_response = conn.getresponse()
except BaseException as e:
# Remove the TypeError from the exception chain in
# Python 3 (including for exceptions like SystemExit).
# Otherwise it looks like a bug in the code.
six.raise_from(e, None)
except (SocketTimeout, BaseSSLError, SocketError) as e:
> self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/connectionpool.py:447:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>, err = timeout('timed out')
url = '/simbad/sim-script', timeout_value = 60
def _raise_timeout(self, err, url, timeout_value):
"""Is the error actually a timeout? Will raise a ReadTimeout or pass"""
if isinstance(err, SocketTimeout):
> raise ReadTimeoutError(
self, url, "Read timed out. (read timeout=%s)" % timeout_value
)
E urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='simbad.u-strasbg.fr', port=80): Read timed out. (read timeout=60)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/connectionpool.py:336: ReadTimeoutError
During handling of the above exception, another exception occurred:
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f2e0670>
def test_query_region_async(self):
> response = simbad.core.Simbad.query_region_async(
ICRS_COORDS_M42, radius=5 * u.deg, equinox=2000.0, epoch='J2000')
astroquery/simbad/tests/test_simbad_remote.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:718: in query_region_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:313: in _request
response = query.request(self._session,
astroquery/query.py:66: in request
return session.request(self.method, self.url, params=self.params,
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x14f0d1af0>, request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=60, read=60, total=None), verify = True, cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
raise ConnectionError(e, request=request)
except ClosedPoolError as e:
raise ConnectionError(e, request=request)
except _ProxyError as e:
raise ProxyError(e)
except (_SSLError, _HTTPError) as e:
if isinstance(e, _SSLError):
# This branch is for urllib3 versions earlier than v1.22
raise SSLError(e, request=request)
elif isinstance(e, ReadTimeoutError):
> raise ReadTimeout(e, request=request)
E requests.exceptions.ReadTimeout: HTTPConnectionPool(host='simbad.u-strasbg.fr', port=80): Read timed out. (read timeout=60)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/requests/adapters.py:529: ReadTimeout
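
Unlike the cache-related failures, this one is a plain 60-second read timeout against simbad.u-strasbg.fr. If the server is merely slow rather than down, one possible workaround when re-running the query interactively is to raise astroquery's per-request timeout; the sketch below assumes the usual Simbad.TIMEOUT class attribute (backed by astroquery.simbad.conf.timeout) and reuses the coordinates from the sim-script body in the traceback above:

import astropy.units as u
from astropy.coordinates import SkyCoord
from astroquery.simbad import Simbad

# Coordinates taken from the request body shown in the traceback above.
coords = SkyCoord("05h35m17.3s", "-80d52m00s", frame="icrs")

Simbad.TIMEOUT = 300  # seconds; this run timed out at the 60 s default
result = Simbad.query_region(coords, radius=5 * u.deg)
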
______________________________________ TestSimbad.test_query_region_async_vector _______________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f9ba910>
def test_query_region_async_vector(self):
> response1 = simbad.core.Simbad.query_region_async(multicoords,
radius=0.5*u.arcsec)
astroquery/simbad/tests/test_simbad_remote.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:718: in query_region_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/805025026ce31c15b325920b28386a7e578ce1705d9437adc1e38acf.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
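
The AttributeError here is a generic pickle limitation rather than anything SIMBAD-specific: a function defined inside another function cannot be pickled, so to_cache fails as soon as the response carries such an object. A minimal sketch reproducing the same message; the names only mirror those in the traceback, the real test helper in astroquery differs:

    import pickle

    def post_mockreturn():
        # Nested function, analogous to the last_query hook attached to the mocked response.
        def last_query():
            pass
        return last_query

    try:
        pickle.dumps(post_mockreturn())
    except AttributeError as exc:
        print(exc)  # Can't pickle local object 'post_mockreturn.<locals>.last_query'
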
_____________________________________________ TestSimbad.test_query_region _____________________________________________
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>
conn = <urllib3.connection.HTTPConnection object at 0x14f9b4fa0>, method = 'POST', url = '/simbad/sim-script'
timeout = Timeout(connect=60, read=60, total=None), chunked = False
httplib_request_kw = {'body': 'script=votable+%7Bmain_id%2Ccoordinates%7D%0Avotable+open%0Aquery+coo+5%3A35%3A17.3+-80%3A52%3A00+radius%3D5...D=E196137FBC488FCE1627742B2CE3FE63.new', 'Content-Length': '154', 'Content-Type': 'application/x-www-form-urlencoded'}}
timeout_obj = Timeout(connect=60, read=60, total=None), read_timeout = 60
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
self._validate_conn(conn)
except (SocketTimeout, BaseSSLError) as e:
# Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
raise
# conn.request() calls http.client.*.request, not the method in
# urllib3.request. It also calls makefile (recv) on the socket.
try:
if chunked:
conn.request_chunked(method, url, **httplib_request_kw)
else:
conn.request(method, url, **httplib_request_kw)
# We are swallowing BrokenPipeError (errno.EPIPE) since the server is
# legitimately able to close the connection after sending a valid response.
# With this behaviour, the received response is still readable.
except BrokenPipeError:
# Python 3
pass
except IOError as e:
# Python 2 and macOS/Linux
# EPIPE and ESHUTDOWN are BrokenPipeError on Python 2, and EPROTOTYPE is needed on macOS
# https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
if e.errno not in {
errno.EPIPE,
errno.ESHUTDOWN,
errno.EPROTOTYPE,
}:
raise
# Reset the timeout for the recv() on the socket
read_timeout = timeout_obj.read_timeout
# App Engine doesn't have a sock attr
if getattr(conn, "sock", None):
# In Python 3 socket.py will catch EAGAIN and return None when you
# try and read into the file pointer created by http.client, which
# instead raises a BadStatusLine exception. Instead of catching
# the exception and assuming all BadStatusLine exceptions are read
# timeouts, check for a zero timeout before making the request.
if read_timeout == 0:
raise ReadTimeoutError(
self, url, "Read timed out. (read timeout=%s)" % read_timeout
)
if read_timeout is Timeout.DEFAULT_TIMEOUT:
conn.sock.settimeout(socket.getdefaulttimeout())
else: # None or a value
conn.sock.settimeout(read_timeout)
# Receive the response from the server
try:
try:
# Python 2.7, use buffering of HTTP responses
httplib_response = conn.getresponse(buffering=True)
except TypeError:
# Python 3
try:
httplib_response = conn.getresponse()
except BaseException as e:
# Remove the TypeError from the exception chain in
# Python 3 (including for exceptions like SystemExit).
# Otherwise it looks like a bug in the code.
> six.raise_from(e, None)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/connectionpool.py:445:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
value = None, from_value = None
> ???
<string>:3:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>
conn = <urllib3.connection.HTTPConnection object at 0x14f9b4fa0>, method = 'POST', url = '/simbad/sim-script'
timeout = Timeout(connect=60, read=60, total=None), chunked = False
httplib_request_kw = {'body': 'script=votable+%7Bmain_id%2Ccoordinates%7D%0Avotable+open%0Aquery+coo+5%3A35%3A17.3+-80%3A52%3A00+radius%3D5...D=E196137FBC488FCE1627742B2CE3FE63.new', 'Content-Length': '154', 'Content-Type': 'application/x-www-form-urlencoded'}}
timeout_obj = Timeout(connect=60, read=60, total=None), read_timeout = 60
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
self._validate_conn(conn)
except (SocketTimeout, BaseSSLError) as e:
# Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
raise
# conn.request() calls http.client.*.request, not the method in
# urllib3.request. It also calls makefile (recv) on the socket.
try:
if chunked:
conn.request_chunked(method, url, **httplib_request_kw)
else:
conn.request(method, url, **httplib_request_kw)
# We are swallowing BrokenPipeError (errno.EPIPE) since the server is
# legitimately able to close the connection after sending a valid response.
# With this behaviour, the received response is still readable.
except BrokenPipeError:
# Python 3
pass
except IOError as e:
# Python 2 and macOS/Linux
# EPIPE and ESHUTDOWN are BrokenPipeError on Python 2, and EPROTOTYPE is needed on macOS
# https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
if e.errno not in {
errno.EPIPE,
errno.ESHUTDOWN,
errno.EPROTOTYPE,
}:
raise
# Reset the timeout for the recv() on the socket
read_timeout = timeout_obj.read_timeout
# App Engine doesn't have a sock attr
if getattr(conn, "sock", None):
# In Python 3 socket.py will catch EAGAIN and return None when you
# try and read into the file pointer created by http.client, which
# instead raises a BadStatusLine exception. Instead of catching
# the exception and assuming all BadStatusLine exceptions are read
# timeouts, check for a zero timeout before making the request.
if read_timeout == 0:
raise ReadTimeoutError(
self, url, "Read timed out. (read timeout=%s)" % read_timeout
)
if read_timeout is Timeout.DEFAULT_TIMEOUT:
conn.sock.settimeout(socket.getdefaulttimeout())
else: # None or a value
conn.sock.settimeout(read_timeout)
# Receive the response from the server
try:
try:
# Python 2.7, use buffering of HTTP responses
httplib_response = conn.getresponse(buffering=True)
except TypeError:
# Python 3
try:
> httplib_response = conn.getresponse()
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/connectionpool.py:440:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connection.HTTPConnection object at 0x14f9b4fa0>
def getresponse(self):
"""Get the response from the server.
If the HTTPConnection is in the correct state, returns an
instance of HTTPResponse or of whatever object is returned by
the response_class variable.
If a request has not been sent or if a previous response has
not be handled, ResponseNotReady is raised. If the HTTP
response indicates that the connection should be closed, then
it will be closed before the response is returned. When the
connection is closed, the underlying socket is closed.
"""
# if a prior response has been completed, then forget about it.
if self.__response and self.__response.isclosed():
self.__response = None
# if a prior response exists, then it must be completed (otherwise, we
# cannot read this response's header to determine the connection-close
# behavior)
#
# note: if a prior response existed, but was connection-close, then the
# socket and response were made independent of this HTTPConnection
# object since a new request requires that we open a whole new
# connection
#
# this means the prior response had one of two states:
# 1) will_close: this connection was reset and the prior socket and
# response operate independently
# 2) persistent: the response was retained and we await its
# isclosed() status to become true.
#
if self.__state != _CS_REQ_SENT or self.__response:
raise ResponseNotReady(self.__state)
if self.debuglevel > 0:
response = self.response_class(self.sock, self.debuglevel,
method=self._method)
else:
response = self.response_class(self.sock, method=self._method)
try:
try:
> response.begin()
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/http/client.py:1347:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <http.client.HTTPResponse object at 0x14f30bc10>
def begin(self):
if self.headers is not None:
# we've already started reading the response
return
# read until we get a non-100 response
while True:
> version, status, reason = self._read_status()
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/http/client.py:307:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <http.client.HTTPResponse object at 0x14f30bc10>
def _read_status(self):
> line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/http/client.py:268:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <socket.SocketIO object at 0x14f30bbb0>, b = <memory at 0x14f1f9c40>
def readinto(self, b):
"""Read up to len(b) bytes into the writable buffer *b* and return
the number of bytes read. If the socket is non-blocking and no bytes
are available, None is returned.
If *b* is non-empty, a 0 return value indicates that the connection
was shutdown at the other end.
"""
self._checkClosed()
self._checkReadable()
if self._timeout_occurred:
raise OSError("cannot read from timed out object")
while True:
try:
> return self._sock.recv_into(b)
E socket.timeout: timed out
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/socket.py:704: timeout
During handling of the above exception, another exception occurred:
self = <requests.adapters.HTTPAdapter object at 0x14f0d1af0>, request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=60, read=60, total=None), verify = True, cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
> resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/requests/adapters.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>, method = 'POST', url = '/simbad/sim-script'
body = 'script=votable+%7Bmain_id%2Ccoordinates%7D%0Avotable+open%0Aquery+coo+5%3A35%3A17.3+-80%3A52%3A00+radius%3D5.0d+frame%3DICRS+equi%3D2000.0%0Avotable+close'
headers = {'User-Agent': 'astroquery/0.4.4.dev7065_testrun python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept'...ID=E196137FBC488FCE1627742B2CE3FE63.new', 'Content-Length': '154', 'Content-Type': 'application/x-www-form-urlencoded'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None), redirect = False
assert_same_host = False, timeout = Timeout(connect=60, read=60, total=None), pool_timeout = None, release_conn = False
chunked = False, body_pos = None, response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/simbad/sim-script', query=None, fragment=None)
destination_scheme = None, conn = None, release_this_conn = True, http_tunnel_required = False, err = None
clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
# If we're going to release the connection in ``finally:``, then
# the response doesn't need to know about the connection. Otherwise
# it will also try to release it and we'll have a double-release
# mess.
response_conn = conn if not release_conn else None
# Pass method to Response for length checking
response_kw["request_method"] = method
# Import httplib's response into our own wrapper object
response = self.ResponseCls.from_httplib(
httplib_response,
pool=self,
connection=response_conn,
retries=retries,
**response_kw
)
# Everything went great!
clean_exit = True
except EmptyPoolError:
# Didn't get a connection from the pool, no need to clean up
clean_exit = True
release_this_conn = False
raise
except (
TimeoutError,
HTTPException,
SocketError,
ProtocolError,
BaseSSLError,
SSLError,
CertificateError,
) as e:
# Discard the connection for these exceptions. It will be
# replaced during the next _get_conn() call.
clean_exit = False
if isinstance(e, (BaseSSLError, CertificateError)):
e = SSLError(e)
elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy:
e = ProxyError("Cannot connect to proxy.", e)
elif isinstance(e, (SocketError, HTTPException)):
e = ProtocolError("Connection aborted.", e)
> retries = retries.increment(
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/connectionpool.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = Retry(total=0, connect=None, read=False, redirect=None, status=None), method = 'POST', url = '/simbad/sim-script'
response = None
error = ReadTimeoutError("HTTPConnectionPool(host='simbad.u-strasbg.fr', port=80): Read timed out. (read timeout=60)")
_pool = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>
_stacktrace = <traceback object at 0x14f5a8d80>
def increment(
self,
method=None,
url=None,
response=None,
error=None,
_pool=None,
_stacktrace=None,
):
"""Return a new Retry object with incremented retry counters.
:param response: A response object, or None, if the server did not
return a response.
:type response: :class:`~urllib3.response.HTTPResponse`
:param Exception error: An error encountered during the request, or
None if the response was received successfully.
:return: A new ``Retry`` object.
"""
if self.total is False and error:
# Disabled, indicate to re-raise the error.
raise six.reraise(type(error), error, _stacktrace)
total = self.total
if total is not None:
total -= 1
connect = self.connect
read = self.read
redirect = self.redirect
status_count = self.status
other = self.other
cause = "unknown"
status = None
redirect_location = None
if error and self._is_connection_error(error):
# Connect retry?
if connect is False:
raise six.reraise(type(error), error, _stacktrace)
elif connect is not None:
connect -= 1
elif error and self._is_read_error(error):
# Read retry?
if read is False or not self._is_method_retryable(method):
> raise six.reraise(type(error), error, _stacktrace)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/util/retry.py:531:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tp = <class 'urllib3.exceptions.ReadTimeoutError'>, value = None, tb = None
def reraise(tp, value, tb=None):
try:
if value is None:
value = tp()
if value.__traceback__ is not tb:
raise value.with_traceback(tb)
> raise value
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/packages/six.py:735:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>, method = 'POST', url = '/simbad/sim-script'
body = 'script=votable+%7Bmain_id%2Ccoordinates%7D%0Avotable+open%0Aquery+coo+5%3A35%3A17.3+-80%3A52%3A00+radius%3D5.0d+frame%3DICRS+equi%3D2000.0%0Avotable+close'
headers = {'User-Agent': 'astroquery/0.4.4.dev7065_testrun python-requests/2.25.1', 'Accept-Encoding': 'gzip, deflate', 'Accept'...ID=E196137FBC488FCE1627742B2CE3FE63.new', 'Content-Length': '154', 'Content-Type': 'application/x-www-form-urlencoded'}
retries = Retry(total=0, connect=None, read=False, redirect=None, status=None), redirect = False
assert_same_host = False, timeout = Timeout(connect=60, read=60, total=None), pool_timeout = None, release_conn = False
chunked = False, body_pos = None, response_kw = {'decode_content': False, 'preload_content': False}
parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/simbad/sim-script', query=None, fragment=None)
destination_scheme = None, conn = None, release_this_conn = True, http_tunnel_required = False, err = None
clean_exit = False
def urlopen(
self,
method,
url,
body=None,
headers=None,
retries=None,
redirect=True,
assert_same_host=True,
timeout=_Default,
pool_timeout=None,
release_conn=None,
chunked=False,
body_pos=None,
**response_kw
):
"""
Get a connection from the pool and perform an HTTP request. This is the
lowest level call for making a request, so you'll need to specify all
the raw details.
.. note::
More commonly, it's appropriate to use a convenience method provided
by :class:`.RequestMethods`, such as :meth:`request`.
.. note::
`release_conn` will only behave as expected if
`preload_content=False` because we want to make
`preload_content=False` the default behaviour someday soon without
breaking backwards compatibility.
:param method:
HTTP request method (such as GET, POST, PUT, etc.)
:param url:
The URL to perform the request on.
:param body:
Data to send in the request body, either :class:`str`, :class:`bytes`,
an iterable of :class:`str`/:class:`bytes`, or a file-like object.
:param headers:
Dictionary of custom headers to send, such as User-Agent,
If-None-Match, etc. If None, pool headers are used. If provided,
these headers completely replace any pool-specific headers.
:param retries:
Configure the number of retries to allow before raising a
:class:`~urllib3.exceptions.MaxRetryError` exception.
Pass ``None`` to retry until you receive a response. Pass a
:class:`~urllib3.util.retry.Retry` object for fine-grained control
over different types of retries.
Pass an integer number to retry connection errors that many times,
but no other types of errors. Pass zero to never retry.
If ``False``, then retries are disabled and any exception is raised
immediately. Also, instead of raising a MaxRetryError on redirects,
the redirect response will be returned.
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
:param redirect:
If True, automatically handle redirects (status codes 301, 302,
303, 307, 308). Each redirect counts as a retry. Disabling retries
will disable redirect, too.
:param assert_same_host:
If ``True``, will make sure that the host of the pool requests is
consistent else will raise HostChangedError. When ``False``, you can
use the pool on an HTTP proxy and request foreign hosts.
:param timeout:
If specified, overrides the default timeout for this one
request. It may be a float (in seconds) or an instance of
:class:`urllib3.util.Timeout`.
:param pool_timeout:
If set and the pool is set to block=True, then this method will
block for ``pool_timeout`` seconds and raise EmptyPoolError if no
connection is available within the time period.
:param release_conn:
If False, then the urlopen call will not release the connection
back into the pool once a response is received (but will release if
you read the entire contents of the response such as when
`preload_content=True`). This is useful if you're not preloading
the response's content immediately. You will need to call
``r.release_conn()`` on the response ``r`` to return the connection
back into the pool. If None, it takes the value of
``response_kw.get('preload_content', True)``.
:param chunked:
If True, urllib3 will send the body using chunked transfer
encoding. Otherwise, urllib3 will send the body using the standard
content-length form. Defaults to False.
:param int body_pos:
Position to seek to in file-like body in the event of a retry or
redirect. Typically this won't need to be set because urllib3 will
auto-populate the value when needed.
:param \\**response_kw:
Additional parameters are passed to
:meth:`urllib3.response.HTTPResponse.from_httplib`
"""
parsed_url = parse_url(url)
destination_scheme = parsed_url.scheme
if headers is None:
headers = self.headers
if not isinstance(retries, Retry):
retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
if release_conn is None:
release_conn = response_kw.get("preload_content", True)
# Check host
if assert_same_host and not self.is_same_host(url):
raise HostChangedError(self, url, retries)
# Ensure that the URL we're connecting to is properly encoded
if url.startswith("/"):
url = six.ensure_str(_encode_target(url))
else:
url = six.ensure_str(parsed_url.url)
conn = None
# Track whether `conn` needs to be released before
# returning/raising/recursing. Update this variable if necessary, and
# leave `release_conn` constant throughout the function. That way, if
# the function recurses, the original value of `release_conn` will be
# passed down into the recursive call, and its value will be respected.
#
# See issue #651 [1] for details.
#
# [1] <https://github.com/urllib3/urllib3/issues/651>
release_this_conn = release_conn
http_tunnel_required = connection_requires_http_tunnel(
self.proxy, self.proxy_config, destination_scheme
)
# Merge the proxy headers. Only done when not using HTTP CONNECT. We
# have to copy the headers dict so we can safely change it without those
# changes being reflected in anyone else's copy.
if not http_tunnel_required:
headers = headers.copy()
headers.update(self.proxy_headers)
# Must keep the exception bound to a separate variable or else Python 3
# complains about UnboundLocalError.
err = None
# Keep track of whether we cleanly exited the except block. This
# ensures we do proper cleanup in finally.
clean_exit = False
# Rewind body position, if needed. Record current position
# for future rewinds in the event of a redirect/retry.
body_pos = set_file_position(body, body_pos)
try:
# Request a connection from the queue.
timeout_obj = self._get_timeout(timeout)
conn = self._get_conn(timeout=pool_timeout)
conn.timeout = timeout_obj.connect_timeout
is_new_proxy_conn = self.proxy is not None and not getattr(
conn, "sock", None
)
if is_new_proxy_conn and http_tunnel_required:
self._prepare_proxy(conn)
# Make the request on the httplib connection object.
> httplib_response = self._make_request(
conn,
method,
url,
timeout=timeout_obj,
body=body,
headers=headers,
chunked=chunked,
)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/connectionpool.py:699:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>
conn = <urllib3.connection.HTTPConnection object at 0x14f9b4fa0>, method = 'POST', url = '/simbad/sim-script'
timeout = Timeout(connect=60, read=60, total=None), chunked = False
httplib_request_kw = {'body': 'script=votable+%7Bmain_id%2Ccoordinates%7D%0Avotable+open%0Aquery+coo+5%3A35%3A17.3+-80%3A52%3A00+radius%3D5...D=E196137FBC488FCE1627742B2CE3FE63.new', 'Content-Length': '154', 'Content-Type': 'application/x-www-form-urlencoded'}}
timeout_obj = Timeout(connect=60, read=60, total=None), read_timeout = 60
def _make_request(
self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw
):
"""
Perform a request on a given urllib connection object taken from our
pool.
:param conn:
a connection from one of our connection pools
:param timeout:
Socket timeout in seconds for the request. This can be a
float or integer, which will set the same timeout value for
the socket connect and the socket read, or an instance of
:class:`urllib3.util.Timeout`, which gives you more fine-grained
control over your timeouts.
"""
self.num_requests += 1
timeout_obj = self._get_timeout(timeout)
timeout_obj.start_connect()
conn.timeout = timeout_obj.connect_timeout
# Trigger any extra validation we need to do.
try:
self._validate_conn(conn)
except (SocketTimeout, BaseSSLError) as e:
# Py2 raises this as a BaseSSLError, Py3 raises it as socket timeout.
self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
raise
# conn.request() calls http.client.*.request, not the method in
# urllib3.request. It also calls makefile (recv) on the socket.
try:
if chunked:
conn.request_chunked(method, url, **httplib_request_kw)
else:
conn.request(method, url, **httplib_request_kw)
# We are swallowing BrokenPipeError (errno.EPIPE) since the server is
# legitimately able to close the connection after sending a valid response.
# With this behaviour, the received response is still readable.
except BrokenPipeError:
# Python 3
pass
except IOError as e:
# Python 2 and macOS/Linux
# EPIPE and ESHUTDOWN are BrokenPipeError on Python 2, and EPROTOTYPE is needed on macOS
# https://erickt.github.io/blog/2014/11/19/adventures-in-debugging-a-potential-osx-kernel-bug/
if e.errno not in {
errno.EPIPE,
errno.ESHUTDOWN,
errno.EPROTOTYPE,
}:
raise
# Reset the timeout for the recv() on the socket
read_timeout = timeout_obj.read_timeout
# App Engine doesn't have a sock attr
if getattr(conn, "sock", None):
# In Python 3 socket.py will catch EAGAIN and return None when you
# try and read into the file pointer created by http.client, which
# instead raises a BadStatusLine exception. Instead of catching
# the exception and assuming all BadStatusLine exceptions are read
# timeouts, check for a zero timeout before making the request.
if read_timeout == 0:
raise ReadTimeoutError(
self, url, "Read timed out. (read timeout=%s)" % read_timeout
)
if read_timeout is Timeout.DEFAULT_TIMEOUT:
conn.sock.settimeout(socket.getdefaulttimeout())
else: # None or a value
conn.sock.settimeout(read_timeout)
# Receive the response from the server
try:
try:
# Python 2.7, use buffering of HTTP responses
httplib_response = conn.getresponse(buffering=True)
except TypeError:
# Python 3
try:
httplib_response = conn.getresponse()
except BaseException as e:
# Remove the TypeError from the exception chain in
# Python 3 (including for exceptions like SystemExit).
# Otherwise it looks like a bug in the code.
six.raise_from(e, None)
except (SocketTimeout, BaseSSLError, SocketError) as e:
> self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/connectionpool.py:447:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <urllib3.connectionpool.HTTPConnectionPool object at 0x14f5dbf70>, err = timeout('timed out')
url = '/simbad/sim-script', timeout_value = 60
def _raise_timeout(self, err, url, timeout_value):
"""Is the error actually a timeout? Will raise a ReadTimeout or pass"""
if isinstance(err, SocketTimeout):
> raise ReadTimeoutError(
self, url, "Read timed out. (read timeout=%s)" % timeout_value
)
E urllib3.exceptions.ReadTimeoutError: HTTPConnectionPool(host='simbad.u-strasbg.fr', port=80): Read timed out. (read timeout=60)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/urllib3/connectionpool.py:336: ReadTimeoutError
During handling of the above exception, another exception occurred:
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f30bfa0>
def test_query_region(self):
> result = simbad.core.Simbad.query_region(ICRS_COORDS_M42, radius=5 * u.deg,
equinox=2000.0, epoch='J2000')
astroquery/simbad/tests/test_simbad_remote.py:78:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/utils/class_or_instance.py:25: in f
return self.fn(obj, *args, **kwds)
astroquery/utils/process_asyncs.py:26: in newmethod
response = getattr(self, async_method_name)(*args, **kwargs)
astroquery/simbad/core.py:718: in query_region_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:313: in _request
response = query.request(self._session,
astroquery/query.py:66: in request
return session.request(self.method, self.url, params=self.params,
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/requests/sessions.py:542: in request
resp = self.send(prep, **send_kwargs)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/requests/sessions.py:655: in send
r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <requests.adapters.HTTPAdapter object at 0x14f0d1af0>, request = <PreparedRequest [POST]>, stream = False
timeout = Timeout(connect=60, read=60, total=None), verify = True, cert = None, proxies = OrderedDict()
def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
"""Sends PreparedRequest object. Returns Response object.
:param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
:param stream: (optional) Whether to stream the request content.
:param timeout: (optional) How long to wait for the server to send
data before giving up, as a float, or a :ref:`(connect timeout,
read timeout) <timeouts>` tuple.
:type timeout: float or tuple or urllib3 Timeout object
:param verify: (optional) Either a boolean, in which case it controls whether
we verify the server's TLS certificate, or a string, in which case it
must be a path to a CA bundle to use
:param cert: (optional) Any user-provided SSL certificate to be trusted.
:param proxies: (optional) The proxies dictionary to apply to the request.
:rtype: requests.Response
"""
try:
conn = self.get_connection(request.url, proxies)
except LocationValueError as e:
raise InvalidURL(e, request=request)
self.cert_verify(conn, request.url, verify, cert)
url = self.request_url(request, proxies)
self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
chunked = not (request.body is None or 'Content-Length' in request.headers)
if isinstance(timeout, tuple):
try:
connect, read = timeout
timeout = TimeoutSauce(connect=connect, read=read)
except ValueError as e:
# this may raise a string formatting error.
err = ("Invalid timeout {}. Pass a (connect, read) "
"timeout tuple, or a single float to set "
"both timeouts to the same value".format(timeout))
raise ValueError(err)
elif isinstance(timeout, TimeoutSauce):
pass
else:
timeout = TimeoutSauce(connect=timeout, read=timeout)
try:
if not chunked:
resp = conn.urlopen(
method=request.method,
url=url,
body=request.body,
headers=request.headers,
redirect=False,
assert_same_host=False,
preload_content=False,
decode_content=False,
retries=self.max_retries,
timeout=timeout
)
# Send the request.
else:
if hasattr(conn, 'proxy_pool'):
conn = conn.proxy_pool
low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
for header, value in request.headers.items():
low_conn.putheader(header, value)
low_conn.endheaders()
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
low_conn.send(b'\r\n')
low_conn.send(i)
low_conn.send(b'\r\n')
low_conn.send(b'0\r\n\r\n')
# Receive the response from the server
try:
# For Python 2.7, use buffering of HTTP responses
r = low_conn.getresponse(buffering=True)
except TypeError:
# For compatibility with Python 3.3+
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
except (ProtocolError, socket.error) as err:
raise ConnectionError(err, request=request)
except MaxRetryError as e:
if isinstance(e.reason, ConnectTimeoutError):
# TODO: Remove this in 3.0.0: see #2811
if not isinstance(e.reason, NewConnectionError):
raise ConnectTimeout(e, request=request)
if isinstance(e.reason, ResponseError):
raise RetryError(e, request=request)
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
raise ConnectionError(e, request=request)
except ClosedPoolError as e:
raise ConnectionError(e, request=request)
except _ProxyError as e:
raise ProxyError(e)
except (_SSLError, _HTTPError) as e:
if isinstance(e, _SSLError):
# This branch is for urllib3 versions earlier than v1.22
raise SSLError(e, request=request)
elif isinstance(e, ReadTimeoutError):
> raise ReadTimeout(e, request=request)
E requests.exceptions.ReadTimeout: HTTPConnectionPool(host='simbad.u-strasbg.fr', port=80): Read timed out. (read timeout=60)
/Users/bsipocz/.pyenv/versions/3.9.1/lib/python3.9/site-packages/requests/adapters.py:529: ReadTimeout
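
As the Retry.increment frames above show, the default adapter uses Retry(total=0, ..., read=False), so a single read timeout is re-raised immediately instead of being retried. If retrying reads were actually wanted, a requests Session could mount an adapter with an explicit Retry policy; a sketch under that assumption (the retry counts and backoff are illustrative, and a timed-out SIMBAD query may simply need a longer read timeout instead):

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    session = requests.Session()
    retry = Retry(
        total=3, connect=3, read=3, backoff_factor=1,
        # POST is not in urllib3's default retryable-method set; older urllib3
        # releases spell this parameter method_whitelist instead.
        allowed_methods=["POST"],
    )
    session.mount("http://", HTTPAdapter(max_retries=retry))

    # session.post("http://simbad.u-strasbg.fr/simbad/sim-script",
    #              data={"script": "query id M31"}, timeout=60)
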
____________________________________________ TestSimbad.test_query_regions _____________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14fa8aca0>
def test_query_regions(self):
> result = simbad.core.Simbad.query_region(multicoords, radius=1 * u.arcmin,
equinox=2000.0, epoch='J2000')
astroquery/simbad/tests/test_simbad_remote.py:83:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/utils/class_or_instance.py:25: in f
return self.fn(obj, *args, **kwds)
astroquery/utils/process_asyncs.py:26: in newmethod
response = getattr(self, async_method_name)(*args, **kwargs)
astroquery/simbad/core.py:718: in query_region_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/80859ab7eafc315cb9e0feb0e4feb5e9f0054ca13482077d7b96e13f.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
__________________________________________ TestSimbad.test_query_object_async __________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f193670>
def test_query_object_async(self):
> response = simbad.core.Simbad.query_object_async("m [0-9]",
wildcard=True)
astroquery/simbad/tests/test_simbad_remote.py:88:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:580: in query_object_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/98985b2b94cb4481101abafc174df64c849a598617d8f5ab5ea4b743.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
_____________________________________________ TestSimbad.test_query_object _____________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f84e790>
def test_query_object(self):
> result = simbad.core.Simbad.query_object("m [0-9]", wildcard=True)
astroquery/simbad/tests/test_simbad_remote.py:93:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:543: in query_object
response = self.query_object_async(object_name, wildcard=wildcard,
astroquery/simbad/core.py:580: in query_object_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:311: in _request
response = query.from_cache(self.cache_location)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <astroquery.query.AstroQuery object at 0x14f84e430>
cache_location = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad'
def from_cache(self, cache_location):
request_file = self.request_file(cache_location)
try:
with open(request_file, "rb") as f:
> response = pickle.load(f)
E EOFError: Ran out of input
astroquery/query.py:109: EOFError
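
This EOFError appears to be a knock-on effect of the caching failures above: to_cache opens the cache file before pickle.dump raises, leaving behind a file with no readable pickle data, and the next test's from_cache then hits end-of-file immediately. A minimal sketch of that second step, loading an empty pickle file:

    import os
    import pickle
    import tempfile

    # An empty file is what a failed pickle.dump leaves behind in the cache directory.
    path = os.path.join(tempfile.mkdtemp(), "empty.pickle")
    open(path, "wb").close()

    try:
        with open(path, "rb") as f:
            pickle.load(f)
    except EOFError as exc:
        print(exc)  # Ran out of input
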
__________________________________________ TestSimbad.test_query_multi_object __________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f2e0880>
def test_query_multi_object(self):
> result = simbad.core.Simbad.query_objects(['M32', 'M81'])
astroquery/simbad/tests/test_simbad_remote.py:97:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:607: in query_objects
return self.query_object('\n'.join(object_names), wildcard=wildcard,
astroquery/simbad/core.py:543: in query_object
response = self.query_object_async(object_name, wildcard=wildcard,
astroquery/simbad/core.py:580: in query_object_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/5eaf8ac073e372728c0a8fca514c84c5be7eb0153adde586b5a2dce3.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
___________________________________________ TestSimbad.test_query_object_ids ___________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f6739d0>
def test_query_object_ids(self):
> result = simbad.core.Simbad.query_objectids("Polaris")
astroquery/simbad/tests/test_simbad_remote.py:115:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:923: in query_objectids
response = self.query_objectids_async(object_name, cache=cache,
astroquery/simbad/core.py:954: in query_objectids_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/27044b5e19220de1939ac69ae384666cf867d641f839c22883a17cc8.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
____________________________________ TestSimbad.test_null_response[query_criteria] _____________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f315670>, function = 'query_criteria'
@pytest.mark.parametrize('function', [
('query_criteria'),
('query_object'),
('query_catalog'),
('query_bibobj'),
('query_bibcode'),
('query_objectids')])
def test_null_response(self, function):
> assert (simbad.core.Simbad.__getattribute__(function)('idonotexist')
is None)
astroquery/simbad/tests/test_simbad_remote.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:489: in query_criteria
result = self.query_criteria_async(*args, **kwargs)
astroquery/simbad/core.py:516: in query_criteria_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/7b571c914e207c3c14e86bd5a404974789d62dabcf439f2ce3460313.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
_____________________________________ TestSimbad.test_null_response[query_object] ______________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f5c3100>, function = 'query_object'
@pytest.mark.parametrize('function', [
('query_criteria'),
('query_object'),
('query_catalog'),
('query_bibobj'),
('query_bibcode'),
('query_objectids')])
def test_null_response(self, function):
> assert (simbad.core.Simbad.__getattribute__(function)('idonotexist')
is None)
astroquery/simbad/tests/test_simbad_remote.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:543: in query_object
response = self.query_object_async(object_name, wildcard=wildcard,
astroquery/simbad/core.py:580: in query_object_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/f9757a30f796adb1e2d44a1f4d22d70f6d35423b41b73efd9338e1ab.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
_____________________________________ TestSimbad.test_null_response[query_catalog] _____________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f161250>, function = 'query_catalog'
@pytest.mark.parametrize('function', [
('query_criteria'),
('query_object'),
('query_catalog'),
('query_bibobj'),
('query_bibcode'),
('query_objectids')])
def test_null_response(self, function):
> assert (simbad.core.Simbad.__getattribute__(function)('idonotexist')
is None)
astroquery/simbad/tests/test_simbad_remote.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:743: in query_catalog
response = self.query_catalog_async(catalog, cache=cache,
astroquery/simbad/core.py:775: in query_catalog_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/c76c3e1105dae50eacd99410ee8bc71303c9e72752073845476c9783.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
_____________________________________ TestSimbad.test_null_response[query_bibobj] ______________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f537940>, function = 'query_bibobj'
@pytest.mark.parametrize('function', [
('query_criteria'),
('query_object'),
('query_catalog'),
('query_bibobj'),
('query_bibcode'),
('query_objectids')])
def test_null_response(self, function):
> assert (simbad.core.Simbad.__getattribute__(function)('idonotexist')
is None)
astroquery/simbad/tests/test_simbad_remote.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:797: in query_bibobj
response = self.query_bibobj_async(bibcode,
astroquery/simbad/core.py:830: in query_bibobj_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/0a8c12be6a7f3314f8a3153de29984d939138fe3f138f7dfdbcbb774.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
_____________________________________ TestSimbad.test_null_response[query_bibcode] _____________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f2dfd30>, function = 'query_bibcode'
@pytest.mark.parametrize('function', [
('query_criteria'),
('query_object'),
('query_catalog'),
('query_bibobj'),
('query_bibcode'),
('query_objectids')])
def test_null_response(self, function):
> assert (simbad.core.Simbad.__getattribute__(function)('idonotexist')
is None)
astroquery/simbad/tests/test_simbad_remote.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:858: in query_bibcode
response = self.query_bibcode_async(bibcode, wildcard=wildcard,
astroquery/simbad/core.py:898: in query_bibcode_async
response = self._request("POST", self.SIMBAD_URL, cache=cache,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/d67ce378eaf620fbb725f2f81cf2f738135c17cd347bb76c1ada56be.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
____________________________________ TestSimbad.test_null_response[query_objectids] ____________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f58b9d0>, function = 'query_objectids'
@pytest.mark.parametrize('function', [
('query_criteria'),
('query_object'),
('query_catalog'),
('query_bibobj'),
('query_bibcode'),
('query_objectids')])
def test_null_response(self, function):
> assert (simbad.core.Simbad.__getattribute__(function)('idonotexist')
is None)
astroquery/simbad/tests/test_simbad_remote.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:923: in query_objectids
response = self.query_objectids_async(object_name, cache=cache,
astroquery/simbad/core.py:954: in query_objectids_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/e1c00f36b568763a2cc1d2b57bbe71af1363ecb80f34702c69fefd3f.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
__________________________________________ TestSimbad.test_query_objects_null __________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f1bdfa0>
def test_query_objects_null(self):
> assert simbad.core.Simbad.query_objects(['idonotexist',
'idonotexisteither']) is None
astroquery/simbad/tests/test_simbad_remote.py:135:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/simbad/core.py:607: in query_objects
return self.query_object('\n'.join(object_names), wildcard=wildcard,
astroquery/simbad/core.py:543: in query_object
response = self.query_object_async(object_name, wildcard=wildcard,
astroquery/simbad/core.py:580: in query_object_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/c1bea0f997b83a530bc7cf77deb42038b6b3a4b14d4200677f34340e.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
__________________________________________ TestSimbad.test_query_region_null ___________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f671df0>
def test_query_region_null(self):
> result = simbad.core.Simbad.query_region(
coord.SkyCoord("00h01m0.0s 00h00m0.0s"), radius="0d",
equinox=2000.0, epoch='J2000')
astroquery/simbad/tests/test_simbad_remote.py:140:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/utils/class_or_instance.py:25: in f
return self.fn(obj, *args, **kwds)
astroquery/utils/process_asyncs.py:26: in newmethod
response = getattr(self, async_method_name)(*args, **kwargs)
astroquery/simbad/core.py:718: in query_region_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/fb4fa467b6a70541d337cb42285b600f49b034d5c70a4fa69ad1ab2c.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
_______________________________________ TestSimbad.test_query_small_region_null ________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f5d3c70>
def test_query_small_region_null(self):
> result = simbad.core.Simbad.query_region(
coord.SkyCoord("00h01m0.0s 00h00m0.0s"), radius=1.0 * u.marcsec,
equinox=2000.0, epoch='J2000')
astroquery/simbad/tests/test_simbad_remote.py:147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/utils/class_or_instance.py:25: in f
return self.fn(obj, *args, **kwds)
astroquery/utils/process_asyncs.py:26: in newmethod
response = getattr(self, async_method_name)(*args, **kwargs)
astroquery/simbad/core.py:718: in query_region_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/7d2d9c9c26d1b7a1285961ce47a0ea01062cf087cee48f6d4df877b3.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
_______________________________________ TestSimbad.test_query_zero_sized_region ________________________________________
self = <astroquery.simbad.tests.test_simbad_remote.TestSimbad object at 0x14f829a60>
def test_query_zero_sized_region(self):
> result = simbad.core.Simbad.query_region(
coord.SkyCoord("20h54m05.6889s 37d01m17.380s"), radius="1s",
equinox=2000.0, epoch='J2000')
astroquery/simbad/tests/test_simbad_remote.py:154:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
astroquery/utils/class_or_instance.py:25: in f
return self.fn(obj, *args, **kwds)
astroquery/utils/process_asyncs.py:26: in newmethod
response = getattr(self, async_method_name)(*args, **kwargs)
astroquery/simbad/core.py:718: in query_region_async
response = self._request("POST", self.SIMBAD_URL, data=request_payload,
astroquery/simbad/core.py:235: in _request
response = super(SimbadBaseQuery, self)._request(*args, **kwargs)
astroquery/query.py:320: in _request
to_cache(response, query.request_file(self.cache_location))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
response = <Response [200]>
cache_file = '/var/folders/dc/hsm7tqpx2d57n7vb3k1l81xw0000gq/T/tmphtow6x6hastropy_cache/astropy/astroquery/Simbad/0ce6d386ac484ec3b2825f84ee586794cc7b27c93d3fe1c9db24b5ee.pickle'
def to_cache(response, cache_file):
log.debug("Caching data to {0}".format(cache_file))
with open(cache_file, "wb") as f:
> pickle.dump(response, f)
E AttributeError: Can't pickle local object 'post_mockreturn.<locals>.last_query'
astroquery/query.py:30: AttributeError
=============================================== short test summary info ================================================
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_criteria1 - AttributeError: Can't pickle...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_criteria2 - AttributeError: Can't pickle...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_bibcode_async - AttributeError: Can't pi...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_bibcode - EOFError: Ran out of input
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_bibobj_async - AttributeError: Can't pic...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_bibobj - EOFError: Ran out of input
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_catalog_async - AttributeError: Can't pi...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_catalog - EOFError: Ran out of input
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_region_async - requests.exceptions.ReadT...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_region_async_vector - AttributeError: Ca...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_region - requests.exceptions.ReadTimeout...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_regions - AttributeError: Can't pickle l...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_object_async - AttributeError: Can't pic...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_object - EOFError: Ran out of input
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_multi_object - AttributeError: Can't pic...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_object_ids - AttributeError: Can't pickl...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_null_response[query_criteria] - AttributeError...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_null_response[query_object] - AttributeError: ...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_null_response[query_catalog] - AttributeError:...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_null_response[query_bibobj] - AttributeError: ...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_null_response[query_bibcode] - AttributeError:...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_null_response[query_objectids] - AttributeErro...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_objects_null - AttributeError: Can't pic...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_region_null - AttributeError: Can't pick...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_small_region_null - AttributeError: Can'...
FAILED astroquery/simbad/tests/test_simbad_remote.py::TestSimbad::test_query_zero_sized_region - AttributeError: Can'...
================================= 26 failed, 63 passed, 1 skipped in 132.05s (0:02:12) =================================
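
Of the 26 failures, 20 are the "Can't pickle local object" AttributeError shown above, four are "EOFError: Ran out of input", and two are requests.exceptions.ReadTimeout. The EOFError cases are consistent with reading back a cache file that was created but never successfully written: to_cache() opens the file in "wb" mode before pickle.dump() raises, so an empty or truncated pickle can be left on disk and a later cache lookup for the same request fails while loading it. That ordering is an inference from the to_cache() body quoted in the tracebacks, not something the log states outright, and the two ReadTimeout failures in test_query_region / test_query_region_async look like ordinary server slowness rather than part of the caching problem. A hypothetical reconstruction of the EOFError side:

import os
import pickle
import tempfile

# An empty cache file, such as a failed pickle.dump() may leave behind
# (small pickles are buffered before being flushed to disk).
cache_file = os.path.join(tempfile.mkdtemp(), "broken_cache.pickle")
open(cache_file, "wb").close()

with open(cache_file, "rb") as f:
    try:
        pickle.load(f)
    except EOFError as exc:
        print(exc)  # Ran out of input
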