Publications
Journal Articles
• A. K. Taras, N. Suenderhauf, P. Corke, and D. G. Dansereau, “Inherently privacy-preserving vision for trustworthy autonomous systems: Needs and solutions,” Journal of Responsible Technology, vol. 17, p. 100079, 2024. Available here.
• A. Ravendran, M. Bryson, and D. G. Dansereau, “BuFF: Burst feature finder for light-constrained 3D reconstruction,” IEEE Robotics and Automation Letters (RA-L, ICRA), 2023. Available here.
• T. Coppin, D. W. Palmer, K. Rana, D. G. Dansereau, M. J. Collins, D. A. Atchison, J. Roberts, R. Crawford, and A. Jaiprakash, “Design of a focused light field fundus camera for retinal imaging,” Signal Processing: Image Communication, p. 116869, 2022.
• T. Frizza, D. G. Dansereau, N. M. Seresht, and M. Bewley, “Semantically accurate super-resolution generative adversarial networks,” Computer Vision and Image Understanding (CVIU), p. 103464, 2022. Available here.
• A. Ravendran, M. Bryson, and D. G. Dansereau, “Burst imaging for light-constrained structure-from-motion,” IEEE Robotics and Automation Letters (RA-L, ICRA), vol. 7, no. 2, pp. 1040–1047, Apr. 2022. Available here.
• D. Tsai, P. Corke, T. Peynot, and D. G. Dansereau, “Refractive light-field features for curved transparent objects in structure from motion,” IEEE Robotics and Automation Letters (RA-L, IROS), vol. 6, no. 4, Jun. 2021. Available here.
• T. Wang and D. G. Dansereau, “Multiplexed illumination for classifying visually similar objects,” Applied Optics, vol. 60, no. 10, pp. B23–B31, Apr. 2021. Available here.
• S. K. Gullapalli, C. U. S. Edussooriya, C. Wijenayake, D. G. Dansereau, L. T. Bruton, and A. Madanayake, “Wave-digital filter circuits for single-chip 4-D light field depth-based enhancement,” Multidimensional Systems and Signal Processing (MSSP), 2021. Available here.
• C. Edussooriya, C. Wijenayake, A. Madanayake, N. Liyanage, S. Premaratne, J. Vorhies, D. G. Dansereau, P. Agathoklis, and L. Bruton, “Real-time light field signal processing using 4D/5D digital filter FPGA circuits,” IEEE Transactions on Circuits and Systems II: Express Briefs, 2021. Available here.
• G. M. Schuster, D. G. Dansereau, G. Wetzstein, and J. E. Ford, “Panoramic single-aperture multi-sensor light field camera,” Optics Express, vol. 27, no. 26, pp. 37257–37273, 2019. Available here.
• D. Tsai, D. G. Dansereau, T. Peynot, and P. Corke, “Distinguishing refracted features using light field cameras with application to structure from motion,” IEEE Robotics and Automation Letters (RA-L, ICRA), vol. 4, no. 2, pp. 177–184, Apr. 2019. Available here.
• D. W. Palmer, T. Coppin, K. Rana, D. G. Dansereau, M. Suheimat, M. Maynard, D. Atchison, J. Roberts, R. Crawford, and A. Jaiprakash, “Glare-free retinal imaging using a portable light field fundus camera,” Biomedical Optics Express, 2018.
• D. L. Bongiorno, M. Bryson, T. Bridge, D. G. Dansereau, and S. B. Williams, “Coregistered hyperspectral and stereo image seafloor mapping from an autonomous underwater vehicle,” Journal of Field Robotics (JFR), vol. 35, no. 3, pp. 312–329, 2018. Available here.
• D. G. Dansereau*, R. Konrad*, A. Masood, and G. Wetzstein, “SpinVR: Towards live-streaming 3D virtual reality video,” ACM Transactions on Graphics (TOG), SIGGRAPH ASIA, vol. 36, no. 6, Nov. 2017. Available here.
• D. Tsai, D. G. Dansereau, T. Peynot, and P. Corke, “Image-based visual servoing with light field cameras,” IEEE Robotics and Automation Letters (RA-L), vol. 2, no. 2, Apr. 2017. Available here.
• D. G. Dansereau, S. B. Williams, and P. I. Corke, “Simple change detection from mobile light field cameras,” Computer Vision and Image Understanding (CVIU), vol. 145C, pp. 160–171, 2016. Available here.
• D. G. Dansereau, O. Pizarro, and S. B. Williams, “Linear volumetric focus for light field cameras,” ACM Transactions on Graphics (TOG), Presented at SIGGRAPH 2015, vol. 34, no. 2, p. 15, Feb. 2015. Available here.
• C. U. S. Edussooriya, D. G. Dansereau, L. T. Bruton, and P. Agathoklis, “Five-dimensional (5-D) depth-velocity filtering for enhancing moving objects in light field videos,” IEEE Transactions on Signal Processing (TSP), vol. 63, no. 8, pp. 2151–2163, Apr. 2015. Available here.
• D. G. Dansereau, N. Brock, and J. R. Cooperstock, “Predicting an orchestral conductor’s baton movements using machine learning,” Computer Music Journal, vol. 37, no. 2, pp. 28–45, 2013. Available here.
• A. Madanayake, R. Wimalagunaratne, D. G. Dansereau, R. J. Cintra, and L. T. Bruton, “VLSI architecture for 4-D depth filtering,” Signal, Image and Video Processing, pp. 1–10, Jul. 2013. Available here.
• R. Wimalagunarathne, C. Wijenayake, A. Madanayake, D. G. Dansereau, and L. T. Bruton, “Integral form 4-D light field filters using Xilinx FPGAs and 45 nm CMOS technology,” Multidimensional Systems and Signal Processing (MSSP), 2013. Available here.
• D. G. Dansereau and L. T. Bruton, “A 4-D dual-fan filter bank for depth filtering in light fields,” IEEE Transactions on Signal Processing (TSP), vol. 55, no. 2, pp. 542–549, 2007. Available here.
Scientific Magazine Articles
• C. Roman, G. Inglis, I. Vaughn, C. Smart, D. G. Dansereau, D. Bongiorno, M. Johnson-Roberson, and M. Bryson, “New tools and methods for precision sea floor mapping,” New Frontiers in Ocean Exploration: The E/V Nautilus 2012 Field Season and Summary of Mediterranean Exploration, Oceanography, vol. 26, no. 1, supplement, pp. 10–15, Mar. 2013. Available here.
• A. Madanayake, C. Wijenayake, D. G. Dansereau, T. K. Gunaratne, L. T. Bruton, and S. B. Williams, “Multidimensional (MD) circuits and systems for emerging applications including cognitive radio, radio astronomy, robot vision and imaging,” Circuits and Systems Magazine, vol. 13, no. 1, pp. 10–43, 2013. Available here.
Patents
• A. Jaiprakash, D. Palmer, D. G. Dansereau, T. Coppin, K. Rana, J. Roberts, and R. Crawford, “Ophthalmic imaging apparatus and system.” May 2023.
• M. Hafed, D. G. Dansereau, G. Duerden, S. Laberge, Y. Nazon, and C. Tam, “System and method for physical-layer testing of high-speed serial links in their mission environments.” Aug. 2008.
Fully Reviewed Conference and Workshop Papers
• C. Yan and D. G. Dansereau, “TaCOS: Task-specific camera optimization with simulation,” in Winter Conference on Applications of Computer Vision (WACV), 2025. Available here.
• J. Wilkinson, J. Naylor, R. Griffiths, and D. G. Dansereau, “Adaptive keyframe selection for online iterative NeRF construction,” in International Conference on Robotics and Automation Workshop on Neural Fields in Robotics (ICRA:RoboNerF), 2024. Available here.
• R. Griffiths, J. Naylor, and D. G. Dansereau, “NOCaL: Calibration-free semi-supervised learning of odometry and camera intrinsics,” in Robotics and Automation (ICRA), 2023. Available here.
• A. Taras and D. G. Dansereau, “Hyperbolic view dependency for all-in-focus time of flight fields,” in Australasian Conference on Robotics and Automation (ACRA), 2022. Available here.
• S. T. Digumarti, J. Daniel, A. Ravendran, R. Griffiths, and D. G. Dansereau, “Unsupervised learning of depth estimation and visual odometry for sparse light field cameras,” in Intelligent Robots and Systems (IROS), 2021. Available here.
• D. G. Dansereau, B. Girod, and G. Wetzstein, “LiFF: Light field features in scale and depth,” in Computer Vision and Pattern Recognition (CVPR), 2019. Available here.
• V. Varghese, D. G. Dansereau, M. Bryson, O. Pizarro, and S. B. Williams, “Light field image restoration for vision in scattering media,” in Image Processing (ICIP), 2018. Available here.
• A. Stewart and D. G. Dansereau, “Using planar point correspondence to calibrate camera arrays for light field acquisition,” in Australasian Conference on Robotics and Automation (ACRA), 2017. Available here.
• D. G. Dansereau, G. Schuster, J. Ford, and G. Wetzstein, “A wide-field-of-view monocentric light field camera,” in Computer Vision and Pattern Recognition (CVPR), 2017, pp. 3757–3766. Available here.
• D. G. Dansereau, A. Eriksson, and J. Leitner, “Richardson-Lucy deblurring for moving light field cameras,” in CVPR Workshop on Light Fields for Computer Vision (CVPR:LF4CV), Jul. 2017. Available here.
• G. M. Schuster, I. P. Agurok, J. E. Ford, D. G. Dansereau, and G. Wetzstein, “Panoramic monocentric light field camera,” in Intl. Optical Design Conference (IODC), 2017.
• T. Hojnik, R. Lee, D. G. Dansereau, and J. Leitner, “Designing a robotic hopping cube for lunar exploration,” in Australasian Conference on Robotics and Automation (ACRA), 2016. Available here.
• H. Lu, Y. Li, X. Xu, L. He, Y. Li, D. G. Dansereau, and S. Serikawa, “Underwater image descattering and quality assessment,” in Image Processing (ICIP), 2016. Available here.
• D. G. Dansereau, S. P. N. Singh, and J. Leitner, “Interactive computational imaging for deformable object analysis,” in Robotics and Automation (ICRA), 2016. Available here.
• J. Leitner, W. Chamberlain, D. G. Dansereau, M. Dunbabin, M. Eich, T. Peynot, J. Roberts, R. Russell, and N. Sünderhauf, “LunaRoo: Designing a hopping lunar science payload,” in IEEE Aerospace Conference, 2016. Available here.
• D. G. Dansereau, D. Wood, S. Montabone, and S. B. Williams, “Exploiting parallax in panoramic capture to construct light fields,” in Australasian Conference on Robotics and Automation (ACRA), 2014. Available here.
• S. B. Williams, O. Pizarro, A. Friedman, M. Bryson, D. G. Dansereau, and N. N. Vatani, “Autonomous benthic monitoring – the Australian experience so far,” in Marine Imaging Workshop, 2014.
• D. G. Dansereau, O. Pizarro, and S. B. Williams, “Decoding, calibration and rectification for lenselet-based plenoptic cameras,” in Computer Vision and Pattern Recognition (CVPR), 2013, pp. 1027–1034. Available here.
• D. G. Dansereau, D. L. Bongiorno, O. Pizarro, and S. B. Williams, “Light field image denoising using a linear 4D frequency-hyperfan all-in-focus filter,” in Proceedings SPIE Computational Imaging XI, 2013, p. 86570P. Available here.
• D. L. Bongiorno, M. Bryson, D. G. Dansereau, S. B. Williams, and O. Pizarro, “Spectral characterization of COTS RGB cameras using a linear variable edge filter,” in Proceedings SPIE Digital Photography IX, 2013, p. 86600N. Available here.
• O. Pizarro, S. B. Williams, M. V. Jakuba, M. Johnson-Roberson, I. Mahon, M. Bryson, D. Steinberg, A. Friedman, D. G. Dansereau, N. Nourani-Vatani, D. Bongiorno, M. Bewley, A. Bender, N. Ashan, and B. Douillard, “Benthic monitoring with robotic platforms – the experience of Australia,” in Intl. Underwater Technology Symposium, 2013, pp. 1–10. Available here.
• A. Madanayake, R. Wimalagunaratne, D. G. Dansereau, and L. T. Bruton, “A systolic-array architecture for first-order 4-D IIR frequency-planar digital filters,” in Intl. Symposium on Circuits and Systems (ISCAS), 2012, pp. 3069–3072. Available here.
• A. Madanayake, R. Wimalagunaratne, D. G. Dansereau, and L. T. Bruton, “Design and FPGA-implementation of 1st-order 4D IIR frequency-hyperplanar digital filters,” in Midwest Symposium on Circuits and Systems (MWSCAS), 2011. Available here.
• D. G. Dansereau, I. Mahon, O. Pizarro, and S. B. Williams, “Plenoptic flow: Closed-form visual odometry for light field cameras,” in Intelligent Robots and Systems (IROS), 2011, pp. 4455–4462. Available here.
• D. G. Dansereau and S. B. Williams, “Seabed modeling and distractor extraction for mobile AUVs using light field filtering,” in Robotics and Automation (ICRA), 2011, pp. 1634–1639. Available here.
• D. G. Dansereau and L. T. Bruton, “Gradient-based depth estimation from 4D light fields,” in Intl. Symposium on Circuits and Systems (ISCAS), 2004, vol. 3, pp. 549–552. Available here.
• D. G. Dansereau and L. T. Bruton, “A 4D frequency-planar IIR filter and its application to light field processing,” in Intl. Symposium on Circuits and Systems (ISCAS), 2003, vol. 4, pp. 476–479. Available here.
• N. Chan, D. G. Dansereau, B. Davis, and B. Davies, “VHF impulse response measurements at 40 MHz,” in Proceedings of Wireless 2000, 2000, vol. 1, pp. 133–145.
Workshops and Extended Abstracts
• R. Mishra, J. Naylor, N. H. Barbara, and D. G. Dansereau, “Appearance-aware trajectory optimisation for autonomous on-orbit inspection,” in International Symposium on Artificial Intelligence, Robotics and Automation in Space (i-SAIRAS), 2024.
• J. Guinane, S. Alshammari, T. Bailey, X. Chen, D. G. Dansereau, V. Ila, J. Mehami, S. Sukkarieh, W. Thorp, N. Wallace, S. Williams, J. Wu, Z. Xie, J. Zhou, A. Barton, and X. Wu, “Robotic satellite for in-orbit servicing, assembly, and manufacturing (ISAM): Design, development, and initial results,” in International Symposium on Artificial Intelligence, Robotics and Automation in Space (i-SAIRAS), 2024.
• N. Munasinghe, C. L. Gentil, J. Naylor, M. Asavkin, D. G. Dansereau, and T. Vidal-Calleja, “Towards event-based satellite docking: A photometrically accurate low-earth orbit hardware simulation,” in International Conference on Robotics and Automation Workshop on Heterogeneous Multi-Robot Cooperation for Exploration & Science in Extreme Environments (ICRA:HERMES), 2024. Available here.
• A. K. Taras, N. Suenderhauf, P. Corke, and D. G. Dansereau, “The need for inherently privacy-preserving vision in trustworthy autonomous systems,” in International Conference on Robotics and Automation Workshop on Multidisciplinary Approaches to Co-Creating Trustworthy Autonomous Systems (ICRA:MACTAS), 2023. Available here.
• D. G. Dansereau, “Is it possible to design optimal cameras for robotic vision?” Dagstuhl Seminar on Hyperspectral, Multispectral, and Multimodal (HMM) Imaging: Acquisition, Algorithms, and Applications, vol. 1, no. 2, p. 22, 2017.
• D. G. Dansereau, S. B. Williams, and P. I. Corke, “Closed-form change detection from moving light field cameras,” in IROS Workshop on Alternative Sensing for Robotic Perception, 2015.
• J. Leitner, D. G. Dansereau, S. Shirazi, and P. Corke, “The need for more dynamic and active datasets,” in CVPR Workshop on the Future of Datasets in Computer Vision, 2015.
• A. Mallios, O. Pizarro, J. S. Arey, S. Samanipour, B. De Mol, N. Hurtós, M. Johnson-Roberson, D. G. Dansereau, L. Toohey, U. Lemmin, and R. Camilli, “Synoptic identification of greenhouse gas sources and sinks in Lake Léman.” Granada, Spain, Feb. 2015.
• O. Pizarro, S. Williams, M. Johnson-Roberson, M. Bryson, A. Friedman, D. G. Dansereau, and D. Rao, “Developments in sampling tools and techniques – a machine-centric viewpoint,” in Geohab Workshop, 2014.
• O. Pizarro, M. Jakuba, N. Flemming, D. Sakellariou, J. Henderson, M. Johnson-Roberson, I. Mahon, L. Toohey, D. G. Dansereau, and C. Lees, “AUV-assisted characterization of beachrock formations in Vatika Bay, Laconia, Peloponnese, Greece, and their relevance to local sea level changes and Bronze Age settlements,” in Ocean Sciences Meeting, 2012.
• D. G. Dansereau, “Improved predistortion for harmonic upconversion in radio-on-fibre systems,” TRLabs Technology Forum, 1999.
Theses
• D. G. Dansereau, “Plenoptic signal processing for robust vision in field robotics,” PhD thesis, Australian Centre for Field Robotics, School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, 2014. Available here.
• D. G. Dansereau, “4D light field processing and its application to computer vision,” Master’s thesis, Electrical and Computer Engineering, University of Calgary, 2003. Available here.
Other
• H. Lu, J. Guna, and D. G. Dansereau, “Introduction to the special section on artificial intelligence and computer vision,” Computers & Electrical Engineering, vol. 100, no. 58, pp. 444–446, 2017. Available here.
• D. G. Dansereau, G. Schuster, J. Ford, and G. Wetzstein, “A wide-field-of-view monocentric light field camera,” in Computational Photography (ICCP) Poster, 2017.
• D. Tsai, D. G. Dansereau, S. Martin, and P. Corke, “Mirrored light field video camera adapter,” Queensland University of Technology, Dec. 2016. Available here.
• D. G. Dansereau, D. L. Bongiorno, M. Bryson, O. Pizarro, and S. B. Williams, “On the feasibility of multispectral contrast enhancement for aerial detection of sharks,” Australian Centre for Field Robotics, School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, Feb. 2014.
In Preparation
• D. W. Palmer, R. Griffiths, and D. G. Dansereau, “Principals and pupils of lenslet-based light field camera calibration,” under review, 2024.
• C. J. Galappaththige, J. Lai, L. Windrim, D. G. Dansereau, N. Suenderhauf, and D. Miller, “Multi-view pose-agnostic change localization with zero labels,” under review, 2024. Available here.
• N. Goncharov and D. G. Dansereau, “Segment anything in light fields for real-time applications via constrained prompting,” under review, 2024. Available here.
• C. L. Gentil, J. Naylor, N. Munasinghe, J. Mehami, B. Dai, M. Asavkin, D. G. Dansereau, and T. Vidal-Calleja, “Mixing data-driven and geometric models for satellite docking port state estimation using an RGB or event camera,” under review, 2024. Available here.
• R. Griffiths and D. G. Dansereau, “Adapting convnets for new cameras without retraining,” under review, 2024. Available here.
• J. Naylor, V. Ila, and D. G. Dansereau, “Surf-NeRF: Surface regularised neural radiance fields,” under review, 2024. Available here.
• A. Ravendran, M. Bryson, and D. G. Dansereau, “LBurst: Learning-based robotic burst feature extraction for 3D reconstruction in low light,” under review, 2024. Available here.