    Identification of tidal features in deep optical galaxy images with convolutional neural networks

    Name: stad750.pdf
    Size: 1.668 MB
    Format: PDF
    Description: Final Published Version
    Author
    Domínguez Sánchez, H.
    Martin, G.
    Damjanov, I.
    Buitrago, F.
    Huertas-Company, M.
    Bottrell, C.
    Bernardi, M.
    Knapen, J.H.
    Vega-Ferrero, J.
    Hausen, R.
    Kado-Fong, E.
    Población-Criado, D.
    Souchereau, H.
    Leste, O.K.
    Robertson, B.
    Sahelices, B.
    Johnston, K.V.
    Affiliation
    Steward Observatory, University of Arizona
    Issue Date
    2023-03-22
    Keywords
    galaxies: interactions
    galaxies: structure
    methods: observational
    software: development
    
    Publisher
    Oxford University Press
    Citation
    H Domínguez Sánchez, G Martin, I Damjanov, F Buitrago, M Huertas-Company, C Bottrell, M Bernardi, J H Knapen, J Vega-Ferrero, R Hausen, E Kado-Fong, D Población-Criado, H Souchereau, O K Leste, B Robertson, B Sahelices, K V Johnston, Identification of tidal features in deep optical galaxy images with convolutional neural networks, Monthly Notices of the Royal Astronomical Society, Volume 521, Issue 3, May 2023, Pages 3861–3872, https://doi.org/10.1093/mnras/stad750
    Journal
    Monthly Notices of the Royal Astronomical Society
    Rights
    © 2023 The Author(s). Published by Oxford University Press on behalf of the Royal Astronomical Society.
    Collection Information
    This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at repository@u.library.arizona.edu.
    Abstract
    Interactions between galaxies leave distinguishable imprints in the form of tidal features, which hold important clues about their mass assembly. Unfortunately, these structures are difficult to detect because they are low surface brightness features, so deep observations are needed. Upcoming surveys promise several orders of magnitude increase in depth and sky coverage, for which automated methods for tidal feature detection will become mandatory. We test the ability of a convolutional neural network to reproduce human visual classifications for tidal detections. We use as training ∼6000 simulated images classified by professional astronomers. The mock Hyper Suprime Cam Subaru (HSC) images include variations with redshift, projection angle, and surface brightness (μ_lim = 26–35 mag arcsec⁻²). We obtain satisfactory results with accuracy, precision, and recall values of Acc = 0.84, P = 0.72, and R = 0.85 for the test sample. While the accuracy and precision values are roughly constant for all surface brightness, the recall (completeness) is significantly affected by image depth. The recovery rate shows strong dependence on the type of tidal features: we recover all the images showing shell features and 87 per cent of the tidal streams; these fractions are below 75 per cent for mergers, tidal tails, and bridges. When applied to real HSC images, the performance of the model worsens significantly. We speculate that this is due to the lack of realism of the simulations, and take it as a warning on applying deep learning models to different data domains without prior testing on the actual data.
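    The accuracy, precision, and recall quoted in the abstract are standard binary-classification metrics. As a minimal illustrative sketch (not the authors' code), the Python below shows how these quantities are computed from human labels and CNN output scores; the example labels and the 0.5 decision threshold are hypothetical.

    # Illustrative sketch only: accuracy, precision, and recall for a binary
    # tidal-feature classifier (feature present = 1, absent = 0).
    # The example labels and the 0.5 threshold are hypothetical.
    import numpy as np

    def binary_metrics(y_true, y_score, threshold=0.5):
        """Return (accuracy, precision, recall) at a given decision threshold."""
        y_true = np.asarray(y_true).astype(int)
        y_pred = (np.asarray(y_score) >= threshold).astype(int)
        tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives
        tn = np.sum((y_pred == 0) & (y_true == 0))  # true negatives
        fp = np.sum((y_pred == 1) & (y_true == 0))  # false positives
        fn = np.sum((y_pred == 0) & (y_true == 1))  # false negatives
        accuracy = (tp + tn) / y_true.size
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0  # a.k.a. completeness
        return accuracy, precision, recall

    # Hypothetical human classifications and CNN output probabilities.
    y_true = [1, 0, 1, 1, 0, 0, 1, 0]
    y_score = [0.9, 0.2, 0.7, 0.4, 0.1, 0.6, 0.8, 0.3]
    print(binary_metrics(y_true, y_score))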
    Note
    Immediate access
    ISSN
    0035-8711
    DOI
    10.1093/mnras/stad750
    Version
    Final Published Version
    Collections
    UA Faculty Publications

