mommermi / photometrypipeline
automated photometry pipeline for small to medium-sized observatories
License: GNU General Public License v3.0
The documentation says that the z (and u) bands are supported, but this line in photometrypipeline/pp_calibrate.py (Line 149 in b0ba1b1) checks that the observation filter (the one you are trying to calibrate) is either g, r, or i. Are the z (and u) bands not actually supported?
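For reference, the check in question can be sketched like this (a simplified stand-in, not the actual pp_calibrate.py code; the constant name is my own):

```python
# Simplified sketch of the filter check described above; the tuple
# mirrors what pp_calibrate.py appears to restrict calibration to.
SUPPORTED_SDSS_FILTERS = ('g', 'r', 'i')

def filter_supported(obs_filter):
    """Return True if the observation filter can be calibrated."""
    return obs_filter in SUPPORTED_SDSS_FILTERS
```

If z and u really are supported, extending the tuple to ('u', 'g', 'r', 'i', 'z') would presumably be all that is needed.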
Hi Michael,
pp_run all stops with IOError: cannot find any data... According to the documentation, it should walk through the directories and find all FITS files, but according to the code, -prefix must be set for this to work: pp_run#L353. I'm not sure which is the correct behavior (docs or code), so I'll let you take care of it. At the same time, the call to run_the_pipeline on line 372 is missing the source_tolerance parameter.
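The documented behavior could be sketched like this (a hypothetical helper; the function name and extension handling are my own, not pp_run's code):

```python
import os

def collect_fits_files(root):
    """Walk the directory tree under root and collect all FITS files,
    as the documentation says 'pp_run all' should do."""
    matches = []
    for dirpath, _dirnames, filenames in os.walk(root):
        matches.extend(os.path.join(dirpath, f) for f in filenames
                       if f.lower().endswith(('.fits', '.fit')))
    return sorted(matches)
```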
Thanks,
msk
Hi, I am trying to follow the Documentation 1.0 on an Ubuntu 16.04 system. After installing the required software, I ran 'pp_run mscience*fits' on the example data. It works until 'processing curve-of-growth for frame mscience0217.fits', then shows:
File "/home/youdong/Software/anaconda2/envs/snowflakes/lib/python3.7/site-packages//astroquery/jplhorizons/core.py", line 1247, in parse_horizons
data[col].unit = column_defs[col][1]
KeyError: 'R.A.(ICRF)'
I am not sure what the problem is. Do you have any idea, or have I made a mistake somewhere?
The latest merge breaks my installation.
Traceback (most recent call last):
File "/home/boada/Projects/photometrypipeline/pp_run", line 46, in <module>
import pp_prepare
File "/home/boada/Projects/photometrypipeline/pp_prepare.py", line 44, in <module>
import diagnostics as diag
File "/home/boada/Projects/photometrypipeline/diagnostics.py", line 38, in <module>
from scipy.misc import toimage # requires Pillow
ImportError: cannot import name toimage
It seems this is not an optional requirement, yet Pillow is not listed as a required install in the documentation.
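As an aside, scipy.misc.toimage was deprecated and later removed from SciPy; a minimal stand-in built directly on Pillow might look like this (my own sketch, not the pipeline's code):

```python
import numpy as np
from PIL import Image  # Pillow

def toimage(arr):
    """Scale a 2-D array to 8-bit and wrap it in a PIL Image,
    mimicking the basic behavior of the removed scipy.misc.toimage."""
    arr = np.asarray(arr, dtype=float)
    lo, hi = arr.min(), arr.max()
    if hi == lo:
        # flat image: avoid division by zero, return all black
        scaled = np.zeros(arr.shape, dtype=np.uint8)
    else:
        scaled = ((arr - lo) / (hi - lo) * 255).astype(np.uint8)
    return Image.fromarray(scaled)
```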
Calibrating the i' band, it can't seem to find the associated magnitude error. I printed all of the fields in the catalog, and e_imag is missing. See below.
- extract sources from 1 images using aperture radius 3.68px
29183 sources extracted from frame Hilt3i.fits
----- run photometric calibration
(27627, 25) (sources, columns) read from Hilt3i.fits
query Vizier for SDSS-R9 at 6.979/ +64.104 in a 0.79 deg radius
no data available from SDSS-R9
0 sources downloaded from SDSS-R9
query Vizier for APASS9 at 6.979/ +64.104 in a 0.79 deg radius
1309 sources retrieved.
1309 sources downloaded from APASS9
<TableColumns names=('ident','ra.deg','dec.deg','e_ra.deg','e_dec.deg','Vmag','e_Vmag','Bmag','e_Bmag','gmag','e_gmag','rmag','e_rmag','imag','i_gmag','_RAJ2000','DEJ2000')>
Traceback (most recent call last):
File "/home/boada/Projects/photometrypipeline/pp_run", line 392, in <module>
fixed_aprad, source_tolerance)
File "/home/boada/Projects/photometrypipeline/pp_run", line 276, in run_the_pipeline
diagnostics=True)
File "/home/boada/Projects/photometrypipeline/pp_calibrate.py", line 421, in calibrate
max_sources=2e4, display=display)
File "/home/boada/Projects/photometrypipeline/pp_calibrate.py", line 124, in create_photometrycatalog
cat['e'+filtername+'mag'] > mag_accuracy)
File "/home/boada/Projects/photometrypipeline/catalog.py", line 101, in __getitem__
return self.data[ident]
File "/home/boada/.local/lib/python2.7/site-packages/astropy/table/table.py", line 1196, in __getitem__
return self.columns[item]
File "/home/boada/.local/lib/python2.7/site-packages/astropy/table/table.py", line 109, in __getitem__
return OrderedDict.__getitem__(self, item)
KeyError: 'e_imag'
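One way to avoid the hard KeyError would be to check for the error column before filtering on it. A sketch, using a plain dict of column lists as a stand-in for the pipeline's catalog object (the helper name and fallback behavior are hypothetical):

```python
def accurate_magnitudes(catalog, filtername, mag_accuracy=0.1):
    """Return magnitudes with errors below mag_accuracy, or None if the
    catalog (like APASS9 here) lacks the error column entirely."""
    magcol, errcol = filtername + 'mag', 'e_' + filtername + 'mag'
    if errcol not in catalog:
        return None  # signal the caller to fall back to another catalog
    return [m for m, e in zip(catalog[magcol], catalog[errcol])
            if e < mag_accuracy]
```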
If you have a field which lies just outside of a catalog footprint (see this example), PP can find stars that are near enough to the field but not actually overlapping with your image. Basically, PP finds stars at the very edge of (or just beyond) your FoV. This causes PP to "succeed" in downloading sources, but then fail to actually match any of them.
query Vizier for SDSS-R13 at 216.3/ -4 in a 0.45 deg radius
17 sources retrieved.
17 sources downloaded from SDSS-R13
17 sources with accurate magnitudes in i band
zeropoint for xxxx.ldac: Warning: 0 reference stars after source matching for frame xxxx.ldac
write calibrated data into database files
PP should then try another of the preferred catalogs, but the code isn't set up to do that right now. Line 67 of calibrate.py makes you think that it will try all of the preferred catalogs, but that isn't the case: only a single catalog is returned, and the function is never called again.
This makes it hard to run PP in an automated fashion because it can't always find the right catalog.
In a perfect world, I'd like it to download a reference catalog, check to make sure there are enough sources, and then try to match sources. If that fails, only then, would I want it to download another reference catalog. No point in downloading a bunch of stuff you don't end up using.
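The download-check-match loop described above could be sketched like this (all names are hypothetical; download and match stand in for the pipeline's Vizier query and source matching):

```python
def pick_reference_catalog(preferred, download, match, min_sources=10):
    """Try each preferred catalog in turn, moving on only when the
    download yields too few sources or matching fails, so catalogs
    that won't be used are never downloaded."""
    for name in preferred:
        sources = download(name)
        if len(sources) < min_sources:
            continue  # field probably outside this catalog's footprint
        matched = match(sources)
        if matched:
            return name, matched
    return None, []
```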
This happened to me today:
query Vizier for SDSS-R9 at 281.695/ +45.777 in a 0.58 deg radius
no data available from SDSS-R9
0 sources downloaded from SDSS-R9
query Vizier for APASS9 at 281.695/ +45.777 in a 0.58 deg radius
no data available from APASS9
0 sources downloaded from APASS9
query Vizier for PANSTARRS at 281.695/ +45.777 in a 0.58 deg radius
MAST does currently not allow for PANSTARRS catalog queries with radii larger than 0.5 deg; clip radius to 0.5 deg
And it just sits there. If it fails to download anything, it should just give up after some time.
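A hard deadline on the download step would prevent this. Here's a sketch using a worker thread (a hypothetical wrapper, not existing pipeline code); note that the hung worker itself cannot be cancelled, but the pipeline could at least move on:

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def download_with_timeout(fn, timeout=300):
    """Run a catalog download with a hard deadline; a query that hangs,
    like the PANSTARRS one above, is treated as an empty download."""
    pool = ThreadPoolExecutor(max_workers=1)
    future = pool.submit(fn)
    try:
        return future.result(timeout=timeout)
    except TimeoutError:
        return []  # give up on this catalog after `timeout` seconds
    finally:
        pool.shutdown(wait=False)
```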
Dear Michael,
Testing the code using the example FITS images (217 or 218), no *.dat file is produced, and the following error occurs:
---- run image registration
extract sources from 1 frames
.....
__init__
raise VerifyError('\n'.join(msg))
astropy.io.fits.verify.VerifyError: The following keyword arguments to Column were invalid:
Column disp option (TDISPn) failed verification: Format E15 is not recognized. The invalid value will be ignored for the purpose of formatting the data in this column.
----- derive optimium photometry aperture
Hi Michael,
pp_run all descends into the .diagnostics directory and crashes:
RUN PIPELINE IN /disks/data0/data/lowell/data/20160212/phot/c2013a1/SDSS-R/.diagnostics
ERROR: cannot open file C_2013_A1_(Siding_Spring).gif
ERROR: cannot open file curveofgrowth.dat
Traceback (most recent call last):
File "/home/msk/local/photometrypipeline/pp_run", line 390, in <module>
fixed_aprad, source_tolerance)
File "/home/msk/local/photometrypipeline/pp_run", line 114, in run_the_pipeline
'_pp_conf.instrument_keys accordingly')
KeyError: 'cannot identify telescope/instrument; please update _pp_conf.instrument_keys accordingly'
Maybe this directory name should always be skipped?
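Pruning hidden directories during the walk would skip .diagnostics automatically. A sketch (the helper name is my own):

```python
import os

def find_data_dirs(root):
    """Yield directories containing FITS files, skipping hidden
    directories such as .diagnostics."""
    for dirpath, dirnames, filenames in os.walk(root):
        # editing dirnames in place stops os.walk from descending
        dirnames[:] = [d for d in dirnames if not d.startswith('.')]
        if any(f.endswith('.fits') for f in filenames):
            yield dirpath
```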
Hi there:
I installed your photometrypipeline and ran into some issues that lead me to believe there are some code issues in the repository:
$ pp_run all
Traceback (most recent call last):
File "/home/gtulloch/Dropbox/Astronomy/Projects/photometrypipeline/pp_run", line 402, in <module>
diag.create_summary()
AttributeError: 'Registration_Diagnostics' object has no attribute 'create_summary'
If I go into the code and comment out that line, the code starts running through my data directory but dies on the following:
$ pp_run all
NOTHING TO DO IN /home/gtulloch/Dropbox/Astronomy/Projects/photometrypipeline/data
NOTHING TO DO IN /home/gtulloch/Dropbox/Astronomy/Projects/photometrypipeline/data/DXAND
RUN PIPELINE IN /home/gtulloch/Dropbox/Astronomy/Projects/photometrypipeline/data/DXAND/B
Traceback (most recent call last):
File "/home/gtulloch/Dropbox/Astronomy/Projects/photometrypipeline/pp_run", line 426, in <module>
rerun_registration, asteroids)
TypeError: run_the_pipeline() missing 1 required positional argument: 'keep_wcs'
It seems like there's some code missing in the repository. I've attached the log, but it doesn't seem to be much help.
Thanks very much for making this available; if I can get it working, it will save me a huge amount of work!
Regards,
Gord
I found out that making code PEP8 compliant is perfect for a not-so-interesting meeting. Hence, all functions will gradually be made PEP8 compliant. Please refer to the wiki for a list of functions that have already been worked on.
Hi,
As commented on bug #48, that issue has never been fixed and remains to this day. I think you missed my comment there, and as I'm unable to reopen the issue myself, I decided to open a new one for that purpose.
Cheers!
There is a typo in the .gitignore file: it should be mytelescopes.py, not mytelescopes.pyc, as all *.pyc files are already untracked.
When executed under Python 3.7.5 with astropy 4.1, formatting a frame midtime using iso fails, but to_value can be used instead.
Hi,
I'm trying to use photometrypipeline for analyzing an asteroid lightcurve (total of 195 observations taken with the NOT) and something seems to be going horribly wrong; as far as I can tell, the program does not manage to match any images with the Gaia catalogue and ends up crashing. Flat-field and bias corrections have been applied to the images and the edges of the filter have been trimmed out.
I'm not sure exactly what debug information you need, but here's the terminal output of pp_run *.fits &> pp_log.txt (I notice a lot of NaN and zero values here?), and here is the 'LOG' file written to the data folder. I'm using the commits of #47 (to actually identify the correct filter) on top of the latest master. Do you have any pointers on how to get around this? Let me know if you need more information (or example data files).
I'll post this here for completeness.
On Line 197 in bc33b68: see http://spiff.rit.edu/classes/phys440/lectures/coords/coords.html for an example.
Does anyone have the list of instructions for installing the above two software packages? I am struggling to install them on Ubuntu. Any help would be appreciated. Thanks!
Photometric calibration fails on the z' band.
----- run photometric calibration
(26152, 25) (sources, columns) read from Hilt3z.fits
query Vizier for SDSS-R9 at 6.988/ +64.096 in a 0.82 deg radius
no data available from SDSS-R9
0 sources downloaded from SDSS-R9
query Vizier for APASS9 at 6.988/ +64.096 in a 0.82 deg radius
1403 sources retrieved.
1403 sources downloaded from APASS9
ERROR: no transformation from APASS9 to z available
Traceback (most recent call last):
File "/home/boada/Projects/photometrypipeline/pp_run", line 392, in <module>
fixed_aprad, source_tolerance)
File "/home/boada/Projects/photometrypipeline/pp_run", line 276, in run_the_pipeline
diagnostics=True)
File "/home/boada/Projects/photometrypipeline/pp_calibrate.py", line 421, in calibrate
max_sources=2e4, display=display)
File "/home/boada/Projects/photometrypipeline/pp_calibrate.py", line 107, in create_photometrycatalog
cat['e'+filtername+'mag'] > mag_accuracy)
File "/home/boada/Projects/photometrypipeline/catalog.py", line 101, in __getitem__
return self.data[ident]
File "/home/boada/.local/lib/python2.7/site-packages/astropy/table/table.py", line 1196, in __getitem__
return self.columns[item]
File "/home/boada/.local/lib/python2.7/site-packages/astropy/table/table.py", line 109, in __getitem__
return OrderedDict.__getitem__(self, item)
KeyError: 'e_zmag'
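Since the "no transformation" error is already detected, the calibration step could bail out of that catalog cleanly instead of going on to index a column that was never created. A sketch (all names are hypothetical):

```python
def transformation_or_none(catalog_name, target_band, transformations):
    """Look up a photometric transformation; return None (so the caller
    can try the next catalog) instead of crashing when there is none."""
    key = (catalog_name, target_band)
    if key not in transformations:
        print('no transformation from %s to %s; trying next catalog'
              % (catalog_name, target_band))
        return None
    return transformations[key]
```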
I don't know if this is an issue with PP or with SCAMP. After the pipeline has completed and all of the header keywords have been updated, the values are often given as strings. For example:
PV1_0 = '-6.07203233558e-06' / / Projection distortion parameter
PV1_1 = '1.00009356016' / / Projection distortion parameter
PV1_2 = '0.000109062147791' / / Projection distortion parameter
PV1_4 = '-0.000165347506398' / / Projection distortion parameter
PV1_5 = '0.00059124044575' / / Projection distortion parameter
PV1_6 = '-0.000106843007282' / / Projection distortion parameter
PV2_0 = '-3.9237160343e-05' / / Projection distortion parameter
PV2_1 = '1.00004779852' / / Projection distortion parameter
PV2_2 = '-0.000114143177142' / / Projection distortion parameter
PV2_4 = '0.00091173019759' / / Projection distortion parameter
PV2_5 = '-0.000173258009464' / / Projection distortion parameter
PV2_6 = '0.000342091928913' / / Projection distortion parameter
If you want to do something like running SExtractor on the image with everything updated, it works fine. However, SExtractor gives the coordinates of the objects in X_IMAGE and Y_IMAGE, so you'll need to convert those into RA and DEC using the WCS of the image. OK, so you fire up astropy and get a copy of the header/WCS information, but because the values of the keywords are strings and not floating-point numbers, astropy doesn't know what to do and the conversion from X_IMAGE to RA fails. This is probably something that should be changed in astropy, but if it is an easy fix in PP, then we could do that too.
I'm not sure where the header keywords are getting written by PP, or whether the problem really lies with SCAMP and I'll need to write a workaround.
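A small post-processing step could coerce those keywords back to floats before astropy reads the WCS. A sketch using a plain dict as a stand-in for the FITS header (the real fix would operate on the astropy header object; the helper is hypothetical):

```python
def fix_pv_keywords(header):
    """Convert string-valued PV distortion keywords back to floats so
    WCS code can use them; non-PV and non-numeric values are untouched."""
    for key, value in header.items():
        if key.startswith('PV') and isinstance(value, str):
            try:
                header[key] = float(value)
            except ValueError:
                pass  # leave genuinely non-numeric values alone
    return header
```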
In pp_run.py line 118, the variable instruments is spelled instruemnts, which will lead to a NameError and breakage if pp_run is used on FITS files from multiple instruments.
I just ran the test data with the command 'pp_run mscience021*.fits', but I got the following errors. I could not figure out what the problem is; please help me.
processing curve-of-growth for frame mscience0217.fits
Traceback (most recent call last):
File "/home/yun/anaconda3/lib/python3.7/site-packages/urllib3/connection.py", line 160, in _new_conn
(self._dns_host, self.port), self.timeout, **extra_kw
File "/home/yun/anaconda3/lib/python3.7/site-packages/urllib3/util/connection.py", line 84, in create_connection
raise err
File "/home/yun/anaconda3/lib/python3.7/site-packages/urllib3/util/connection.py", line 74, in create_connection
sock.connect(sa)
socket.timeout: timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/yun/anaconda3/lib/python3.7/site-packages/urllib3/connectionpool.py", line 677, in urlopen
chunked=chunked,
File "/home/yun/anaconda3/lib/python3.7/site-packages/urllib3/connectionpool.py", line 381, in _make_request
self._validate_conn(conn)
File "/home/yun/anaconda3/lib/python3.7/site-packages/urllib3/connectionpool.py", line 976, in _validate_conn
conn.connect()
File "/home/yun/anaconda3/lib/python3.7/site-packages/urllib3/connection.py", line 308, in connect
conn = self._new_conn()
File "/home/yun/anaconda3/lib/python3.7/site-packages/urllib3/connection.py", line 167, in _new_conn
% (self.host, self.timeout),
urllib3.exceptions.ConnectTimeoutError: (<urllib3.connection.HTTPSConnection object at 0x7fda3cfd06d0>, 'Connection to ssd.jpl.nasa.gov timed out. (connect timeout=30)')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/yun/anaconda3/lib/python3.7/site-packages/requests/adapters.py", line 449, in send
timeout=timeout
File "/home/yun/anaconda3/lib/python3.7/site-packages/urllib3/connectionpool.py", line 725, in urlopen
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
File "/home/yun/anaconda3/lib/python3.7/site-packages/urllib3/util/retry.py", line 439, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='ssd.jpl.nasa.gov', port=443): Max retries exceeded with url: /horizons_batch.cgi?batch=1&TABLE_TYPE=OBSERVER&QUANTITIES=%271%2C2%2C3%2C4%2C5%2C6%2C7%2C8%2C9%2C10%2C11%2C12%2C13%2C14%2C15%2C16%2C17%2C18%2C19%2C20%2C21%2C22%2C23%2C24%2C25%2C26%2C27%2C28%2C29%2C30%2C31%2C32%2C33%2C34%2C35%2C36%2C37%2C38%2C39%2C40%2C41%2C42%2C43%27&COMMAND=%223552%3B%22&SOLAR_ELONG=%220%2C180%22&LHA_CUTOFF=0&CSV_FORMAT=YES&CAL_FORMAT=BOTH&ANG_FORMAT=DEG&APPARENT=AIRLESS&REF_SYSTEM=J2000&EXTRA_PREC=NO&CENTER=%27290%27&TLIST=2457786.956633669&SKIP_DAYLT=NO (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x7fda3cfd06d0>, 'Connection to ssd.jpl.nasa.gov timed out. (connect timeout=30)'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/yun/photometrypipeline/pp_run", line 435, in <module>
rerun_registration, asteroids, keep_wcs)
File "/home/yun/photometrypipeline/pp_run", line 249, in run_the_pipeline
diagnostics=True)
File "/home/yun/photometrypipeline/pp_photometry.py", line 381, in photometry
diagnostics=diagnostics)
File "/home/yun/photometrypipeline/pp_photometry.py", line 134, in curve_of_growth_analysis
eph = obj.ephemerides()
File "/home/yun/.local/lib/python3.7/site-packages/astroquery/utils/class_or_instance.py", line 25, in f
return self.fn(obj, *args, **kwds)
File "/home/yun/.local/lib/python3.7/site-packages/astroquery/utils/process_asyncs.py", line 26, in newmethod
response = getattr(self, async_method_name)(*args, **kwargs)
File "/home/yun/.local/lib/python3.7/site-packages/astroquery/jplhorizons/core.py", line 599, in ephemerides_async
timeout=self.TIMEOUT, cache=cache)
File "/home/yun/.local/lib/python3.7/site-packages/astroquery/query.py", line 263, in _request
json=json)
File "/home/yun/.local/lib/python3.7/site-packages/astroquery/query.py", line 71, in request
json=json)
File "/home/yun/anaconda3/lib/python3.7/site-packages/requests/sessions.py", line 530, in request
resp = self.send(prep, **send_kwargs)
File "/home/yun/anaconda3/lib/python3.7/site-packages/requests/sessions.py", line 643, in send
r = adapter.send(request, **kwargs)
File "/home/yun/anaconda3/lib/python3.7/site-packages/requests/adapters.py", line 504, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: HTTPSConnectionPool(host='ssd.jpl.nasa.gov', port=443): Max retries exceeded with url: /horizons_batch.cgi?batch=1&TABLE_TYPE=OBSERVER&QUANTITIES=%271%2C2%2C3%2C4%2C5%2C6%2C7%2C8%2C9%2C10%2C11%2C12%2C13%2C14%2C15%2C16%2C17%2C18%2C19%2C20%2C21%2C22%2C23%2C24%2C25%2C26%2C27%2C28%2C29%2C30%2C31%2C32%2C33%2C34%2C35%2C36%2C37%2C38%2C39%2C40%2C41%2C42%2C43%27&COMMAND=%223552%3B%22&SOLAR_ELONG=%220%2C180%22&LHA_CUTOFF=0&CSV_FORMAT=YES&CAL_FORMAT=BOTH&ANG_FORMAT=DEG&APPARENT=AIRLESS&REF_SYSTEM=J2000&EXTRA_PREC=NO&CENTER=%27290%27&TLIST=2457786.956633669&SKIP_DAYLT=NO (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x7fda3cfd06d0>, 'Connection to ssd.jpl.nasa.gov timed out. (connect timeout=30)'))
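This is a plain network timeout to ssd.jpl.nasa.gov (JPL Horizons), not a bug in your setup. Retrying the query a few times would make the pipeline more robust against transient outages; a sketch (the wrapper is hypothetical, not part of pp_photometry.py):

```python
import time

def query_with_retries(fn, attempts=3, wait=10):
    """Call fn, retrying on any exception (e.g. a ConnectTimeout from
    the Horizons query) with a pause between attempts."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # exhausted all attempts; re-raise the last error
            time.sleep(wait)
```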
Your email in the README file is wrong.
Could example files be included with the distribution? If I want to see the code in action, I have to configure everything to work with some of my own imaging data. It'd be nice to see the code run out of the box.