Over the past few years, Australian microscopy facilities have invested heavily in new-generation high-kV electron microscopy, super-resolution imaging, high-resolution block-face imaging and correlative multimodal microscopy instruments. Although these technological advances in instrumentation have paved the way for unprecedented scientific progress, the ever-increasing volumes of data produced pose substantial challenges for organising and managing workflows from the point of data capture through to storage. In this study, a range of academic facilities in Australia and overseas that operated, or planned to operate, electron microscopy and correlative light–electron microscopy instruments generating large volumes of data were interviewed. General trends, tools, procedures, gaps and challenges shared across all or most of the facilities were identified in four key areas: data movement (including network configuration and capabilities), data processing (processing software packages and the supporting processing infrastructure), data management (including data documentation using metadata) and data orchestration (that is, the overarching automated process spanning from data capture at the instrument to data management for longer-term storage, including provisions for discoverability and accessibility). Overall, this extensive survey has made it possible to map the current landscape of informatics and data management in the field. Although aspects such as researcher training or detailed modality-specific techniques and algorithms are important, they were not directly addressed in this review, whose focus was the IT infrastructure challenge posed by big-data-producing instruments. This study was undertaken under the Australian Characterisation Commons at Scale (ACCS) project.