The rise of computational images: The role of star-targeting spectroscopy
by Francesco Giarrusso
Introduction
This article explores the role of stellar spectroscopy, particularly the contributions of Angelo Secchi in the second half of the 19th century, in foreshadowing key features of computational imaging within the broader framework of operational images. According to Jussi Parikka, operational images[1] perform functions rather than merely representing phenomena. Predating the digital era,[2] such images rely on practices of measurement, extraction, manipulation, and data recording, encompassing 19th century techniques like the discovery and measurement of ‘dark heat’ and ultraviolet rays, photogrammetry, photometry, and stellar spectroscopy.
In contrast, computational images[3] are digital operational images generated and analysed algorithmically: they present extracted and calculated data, translating them into visible patterns that make otherwise imperceptible phenomena recognisable and analysable to the human eye. They represent a reconfiguration of operational images into more advanced forms of data visualisation.
Secchi occupies a pivotal position in this transition for two key reasons. First, he was among the earliest to systematically employ spectroscopy to study stellar spectra; second, his approach combined qualitative and quantitative methods – although predominantly qualitative, his effort to establish a star taxonomy contributed to a nascent form of quantification.
Despite limitations and an imbalance between these approaches, Secchi’s work reveals several characteristics of operational images that prefigure computational imaging. These include invisuality, where spectroscopy interprets data to infer details beyond direct human perception, and the systematic collection of extensive datasets, foreshadowing modern big data practices. Secchi’s spectral classification also involves pattern recognition, a core principle in computational imaging. Furthermore, the analysis of distant stars’ chemical and physical composition in stellar spectroscopy anticipates the remote sensing techniques central to contemporary imaging.
While not aiming to be exhaustive, this study seeks to shed new light on Secchi’s spectroscopic research, restoring stellar spectroscopy to its rightful place within the diverse 19th century context, where various types of operational images, viewed from a media-archaeological perspective, emerge as precursors to contemporary computational imaging.
Astronomical observation and the culture of precision
[T]he modern refinement of large industrial machines has happily impacted the very science that had given them the impetus for precision. An incalculable advantage has come to astronomy from the perfection of optical instruments.[4]
With these words, the Jesuit scientist Angelo Secchi asserts that the increasing accuracy in the production processes of the emerging precision industry has compensated for the ‘imperfection of matter’.[5] He celebrates the triumph of optics and precision mechanics, the theoretical ideal of geometric exactness, and science over the precariousness of matter.
According to Carl Friedrich Gauss,[6] the highest precision could only be pursued by astronomers, ‘who are familiar with the finest means of observation’: telescopes, spectroscopes, sidereal clocks, and micrometers – an array of instruments dedicated to the precise measurement of natural phenomena. Astronomy, therefore, is inherently the science of precision, and the culture of precision found its ideal environment in the astronomical observatory, where it consolidated and spread to other 19th century scientific practices, even influencing the ongoing trend towards the mathematisation of the social sciences.
This was possible not only through astronomers’ access to the most advanced devices but also through their advanced mathematical expertise in conducting and interpreting astronomical observations. Leading centres of expertise and innovation in optics often supplied the most widespread scopic devices. Notable examples include the partnership between the Soleil-Duboscq-Pellin dynasty and the Paris Observatory and, in Italy, the collaboration between the workshop of Giovanni Battista Amici and Giovan Battista Donati, a pivotal figure in the early development of stellar spectroscopy.[7] Their work focused on optical arrangements, the nature of light, and perception, contributing significantly to the interpretation and representation of nature.
The astronomical observatory thus became a model for all observatory sciences, uniting observation techniques, visual skills, and mathematical calculation as foundational principles of astronomical observation. Its rigour lay in mathematically translating and analysing the positions and orbits of celestial bodies. The culmination of this approach was the 1846 discovery of Neptune, whose position Urbain Le Verrier had predicted by calculation. Where the eye could barely discern the orbital irregularities caused by a supposed perturbing body, Le Verrier demonstrated the existence of a previously unknown planet through the ‘power of calculation’.[8]
For centuries, the astronomer’s role had been to observe and calculate the orbits and trajectories of planets and satellites, and the masses and distances of stars, so long as these could be derived from telescopic observation. Indeed, Auguste Comte, the father of positivism, was a staunch opponent of any aspect of astronomy that could not be reduced to visual observation and the resulting determination of the shapes, distances, sizes, and movements of celestial bodies. He argued that ‘our positive knowledge of the stars is necessarily limited to their geometric and mechanical phenomena alone, without being able in any way to encompass other physical, chemical, and physiological research’.[9] Another fervent advocate of positional astronomy, Pierre-Simon Laplace, asserted that ‘the only way to know nature is to question it through observation and computation’,[10] reinforcing the role of mathematics in identifying and determining celestial mechanics.
The position held by Comte and Laplace, far from achieving unanimous consensus in astronomy, faced critical perspectives throughout the 19th century that envisioned astronomical observation based on different principles and with new instruments. In this regard, it becomes necessary to focus on the developments that, following the end of the ‘visibility postulate’[11] and of the epistemic superiority of the unaided eye – an end marked by the triumph of optical instruments in astronomical observation with Kepler and Galileo – paved the way for the emergence of new forms of observation, such as stellar spectroscopy with its prismatic mapping of stars.
Lens and prism for a new astronomical observation
In the early 17th century, astronomical observation departed from the Aristotelian tradition, which regarded the eye as an organ intentionally connected to the essence of objects. This shift marked the transition from a conception of optics based on direct and intentional perception to a physical and causal theory, where light acts as a mediator and generator of images. This perspective paved the way for a strongly instrumental empiricism, in which optical devices not only supported vision but replaced it, radically redefining the nature of astronomical observation. This new paradigm, with optical instruments playing an increasingly pivotal role, laid the groundwork for the later development of spectroscopy – a technique in which observation is no longer based on direct visual perception of the phenomenon, but on the analysis of light itself to investigate the observed object.
Aristotelian optics was based on a philosophical framework where human vision was conceived as an intentional process, linking subject and object through the mediation of light. Vision was understood as an active process in which light conveyed the ‘species’ of visible objects to the observer, enabling an immediate comprehension of their essence.
This intentional and teleological conception of vision[12] was challenged by the theories of Kepler and Galileo, who replaced the idea of intentional transmission with a geometric model of light and vision. With Kepler, there is a ‘transformation of optics from a teleological theory of human vision into a causal theory of the production of the images by light’,[13] where light is no longer an element that provides ‘adequate images of visible objects for the intellect’,[14] but rather a physical agent that propagates according to mathematical and physical laws. Kepler introduces a geometric-causal view of light and the eye, conceiving vision as a passive and mechanical process, akin to that of the camera obscura. The eye passively records what the laws of optical geometry dictate: for Kepler, the pupil does not create the visual experience but receives it.[15]
This geometrisation of vision precludes any intuitive or immediate knowledge, leading instead to a mediated understanding based on the rational interpretation of optical laws. In other words, Kepler’s ‘artificiosa observationes’[16] established the indispensability of geometric optics for astronomical analysis and observation, ending the epistemic superiority of the unaided eye.
With Galileo, Kepler’s model was further emphasised, shifting the focus from optical processes to the integration of vision with the mathematical principles of the universe. By introducing the telescope as a systematic tool for observation, Galileo redefined visual perception as an act of rational knowledge. This was not merely a technological advancement, but a transformation of the observer’s role, who, through this new optical device, interpreted visual data in accordance with mathematical models. The telescope thus turned the object of vision into ‘constructed and calculated data’.[17]
Galileo conceived the telescope as a tool of reason, fundamentally altering the relationship between the human eye and the understanding of the universe. It not only extended sensory capabilities but established a new order of vision, where observation became a rational process. Vision was no longer about seeing, but about demonstrating, through measurement and comparison with mathematical models, that what was observed corresponds to a precise geometric description of the universe.
Another central aspect is Galileo’s recognition of the human eye’s fallibility, which introduced a new awareness of error in observational practices. He understood that the unaided eye, on its own, could not provide accurate knowledge of natural phenomena. The telescope, therefore, was not merely an extension of sight, but a precise instrument designed to redefine sensory perception and align vision with a rational and measurable order. Constructed according to mathematical principles, the telescope redefined sensory perception by granting access to the mathematical language in which the universe is written. In this sense, the telescope did not simply assist the eye; it ‘creates the senses anew’.[18]
However, optical instruments were not without flaws. Significant in this regard was Newton’s observation that light refraction through lenses caused colour dispersion and distortions, which were considered inherent in lens-based systems. This raised doubts about the reliability of both optical instruments and the human eye, already prone to perceptual errors. Natural vision could not overcome these physiological limitations, nor could lenses fully compensate for them. To address these challenges, the human eye’s role in experiments was progressively diminished in favour of instruments designed for precise mechanised measurements and standardised units. This shift redirected the focus towards the primary source of visual perception: light. Alongside lenses, interest grew in the characteristics and propagation laws of light, made evident through the use of prisms.
Initially, Newton believed that chromatic aberration was an inherent characteristic of all lenses,[19] as he thought the artefact was a consequence of light passing through the lens material itself. However, through his experiments with the prism, he discovered that light naturally disperses into its constituent colours, each possessing a specific refractive index. This discovery not only provided fundamental insights into the physical properties of light but also contributed to a more profound understanding of its nature.
Lenses and prisms coexisted as complementary scientific instruments: lenses amplified vision, while prisms revealed light’s intrinsic nature. The lens aided vision through its magnifying power, creating optimal physiological conditions for visual perception, while the prism generated optical effects imperceptible to the naked eye, such as the spectrum.
This dual tradition characterised the optical revolution, marked by a tension between trust in the eye and the growing reliance on optical instruments as precision tools. These theoretical and methodological divergences converged in many scientists’ interest in physiology, colour perception, and eye sensitivity.[20] Astronomers and instrument makers dedicated themselves to studying light and visual processes, examining errors and perceptual illusions, and developing theories and practices related to optical-lens configurations. Their efforts aimed to create optimal conditions for reliable and standardised measurements, minimising subjectivity and visual fatigue.
Another crucial issue that characterised scientific work between the late 18th and early 19th centuries was the search for a method to quantitatively measure light, which could not be directly perceived or calculated. Light always requires an intermediary device, as optical phenomena lack their own measurement units, unlike weight or length.[21] As optical phenomena varied in intensity rather than in extension, direct measurement was challenging. For instance, two identical lights do not double the perceived brightness. Without oversimplifying this historical reconstruction, the solution to converting light’s properties into quantifiable and objective measurements became possible through the convergence of two key developments: understanding the eye’s achromatic nature, whose lenticular architecture suggested the possibility of constructing achromatic lenses to compensate for this physiological limitation, and applying analytical mathematics to describe continuous phenomena, such as optical ones, providing a detailed understanding of the laws governing them.
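The non-additivity of perceived brightness described here can be made concrete with a worked equation. The sketch below uses the logarithmic response later formalised in Fechner’s law – an illustrative model introduced for clarity, not one discussed in the sources cited:

```latex
% Illustrative model (Fechner's law): sensation S grows with the
% logarithm of stimulus intensity I above a threshold I_0.
\[
  S = k \log \frac{I}{I_0}
  \qquad\Longrightarrow\qquad
  S(2I) = S(I) + k \log 2 \;\neq\; 2\,S(I)
\]
% Doubling the physical intensity therefore adds only a constant
% increment to perceived brightness rather than doubling it.
```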
The combined efforts to study the nature of light, extend optical precision beyond human perception, and address visual limitations led to the construction of an early spectroscope, combining a prism, an achromatic telescope, and a theodolite. This arrangement enabled the conversion of spectral images into geometric signals, with each colour corresponding to a precise angular position. Newton’s initial attempts to standardise refractive indices were based on subjective perceptions of reference colours. Fraunhofer[22] was the first to standardise light quantification, establishing the prismatic scale of colours based on mathematical evidence through what would later be known as Fraunhofer lines. He identified 574 dark lines in the solar spectrum, which became a standard reference for subsequently measuring the wavelength of light. These lines, mapped through geometric angles using a theodolite – a tool that symbolically united geodetic and astronomical sciences – prefigured the unity of the universe’s forces,[23] as Secchi’s spectroscopy would later attempt to demonstrate.
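The conversion described above – from colour to angular position to wavelength – can be sketched in modern computational terms. The following Python fragment is a minimal illustration rather than a reconstruction of Fraunhofer’s procedure: the deviation angles are hypothetical placeholders, while the wavelengths attached to the C, D, F, and G lines are their accepted modern values.

```python
# Minimal sketch: a prism spectroscope 'calibrated' by mapping measured
# deviation angles to wavelengths, with Fraunhofer lines as references.
# Angles are hypothetical; wavelengths are modern values in angstroms.
REFERENCE_LINES = {
    "C": (50.10, 6563),  # H-alpha
    "D": (50.45, 5893),  # sodium doublet (mean)
    "F": (51.05, 4861),  # H-beta
    "G": (51.60, 4308),
}

def wavelength_from_angle(angle_deg: float) -> float:
    """Estimate a line's wavelength by interpolating between references."""
    points = sorted(REFERENCE_LINES.values())  # sorted by deviation angle
    for (a0, w0), (a1, w1) in zip(points, points[1:]):
        if a0 <= angle_deg <= a1:
            t = (angle_deg - a0) / (a1 - a0)
            return w0 + t * (w1 - w0)
    raise ValueError("angle outside the calibrated range")

# An unidentified dark line measured at a hypothetical deviation of 50.70 degrees:
print(f"approx. {wavelength_from_angle(50.70):.0f} angstroms")
```

In practice a prism’s angle-to-wavelength relation is non-linear, so a dispersion formula rather than linear interpolation would be fitted; the point here is only that each colour becomes a number.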
Methodologies and innovations in nineteenth-century spectroscopy
The prismatic spectrum introduced a new mode of vision, a scopic domain no longer tied to light’s revelatory action but to its unprecedented ability to generate optical effects not available in nature or directly perceivable by the unaided eye.[24] This marked the transition of optics from a purely observational science to an experimental one. Newton, in his New Theory of Light and Colours (1672), was already aware of this generative power through his early prismatic experiments. However, it was not until 1814-1815 that Fraunhofer produced the first spectrum map, depicting the visible solar spectrum and engraving each dark absorption line with corresponding letters for identification.
Fig. 1: Fraunhofer’s undated hand-drawn pencil sketch of the solar spectrum, featuring labelled major lines.
Fraunhofer’s engraving remained a standard for over 45 years in terms of precision and detail, but the conversion of the spectrum into geometric parameters still relied on the optical qualities of the prism and the physiological capabilities of the human eye. Despite Fraunhofer’s interest in transforming refrangibility into angular positions using a theodolite and an achromatic telescope, his map still reflected a predominantly qualitative description of the spectral phenomenon, employing Newtonian terminology for chromatic regions.
In these early decades of spectral analysis, methodologies and terminologies were in flux. Descriptions of absorption phenomena were primarily verbal and dependent on each observer’s abilities, leading to a lack of consensus regarding the number, intensity, and spacing of lines, as well as the differentiation of chromatic regions. For example, David Alter attempted to limit subjective descriptions by using tables that assigned metals a certain number of lines corresponding to Newton’s seven primary colours.
Fig. 2: David Alter’s 1854 illustration of spark spectra, presented as a tabulated count of lines across seven colours.
However, the tables continued to include subjective descriptions such as ‘faint yellow line’ or ‘in the orange a very bright band, one of yellow and one of green – two faint bands in the blue which are not always seen’.[25] The pursuit of greater objectivity involved various attempts, such as Sir David Brewster’s alphanumeric reference system for labelling intermediate spectrum lines between Fraunhofer’s already-identified lines, or William Allen Miller’s 1845 identification of recurring patterns in line distribution.
These methodologies reveal early attempts at transitioning from qualitative to quantitative approaches in spectroscopy, moving from discursive descriptions to efforts at standardisation and objectification using tables and charts to depict accurate positions and intervals between spectrum lines. The shift towards a quantitative approach involved introducing techniques and instruments that allowed precise and repeatable measurement of spectrum lines. In 1861-1862, Gustav Kirchhoff introduced numerical scales in his solar spectrum map, assigning numerical values to the positions of spectrum lines.
In addition, in collaboration with Bunsen, Kirchhoff identified dark absorption lines with luminous emission lines, initiating the chemical analysis of various light sources, including stellar light.[26] This enabled accurate measurement and standardised comparison between different observations, allowing Kirchhoff and his successors to identify specific chemical elements in the Sun by comparing solar spectrum lines with those of terrestrial elements. They discovered elements such as iron, sodium, calcium, magnesium, chromium, and nickel in the Sun, suggesting that ‘the physical constitution of the fixed stars resembles that of the Sun’.[27]
Fig. 3: A section of Kirchhoff’s solar spectrum map from 1861.
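The comparative logic underlying this identification can be rendered as a short, hedged sketch: the laboratory emission lines of candidate elements are checked for coincidence with observed solar absorption lines within a measurement tolerance. The element wavelengths below are well-known characteristic values in ångströms; the ‘observed’ solar list and the tolerance are invented for illustration, not Kirchhoff’s data.

```python
# Sketch of element identification by line coincidence, in the spirit of
# Kirchhoff and Bunsen's comparisons (illustrative values only).
LAB_LINES = {  # characteristic emission lines, in angstroms
    "hydrogen": [6563, 4861, 4340],
    "sodium":   [5890, 5896],
    "iron":     [5270, 4384, 4046],
    "calcium":  [3968, 3934],
}

solar_lines = [3935, 3968, 4047, 4341, 4385, 4862, 5269, 5890, 5895, 6562]
TOLERANCE = 2  # allowed mismatch in angstroms (illustrative)

def identify(observed, lab, tol):
    """Return elements all of whose lab lines coincide with observed lines."""
    return [element for element, lines in lab.items()
            if all(any(abs(o - l) <= tol for o in observed) for l in lines)]

print(identify(solar_lines, LAB_LINES, TOLERANCE))
# -> ['hydrogen', 'sodium', 'iron', 'calcium']
```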
In 1868, Anders Ångström published a solar spectrum atlas that included precise measurements of spectrum lines in wavelength, using units of ten-millionths of a millimetre (10⁻¹⁰ m). The transition to quantitative measurement techniques was gradual and involved the adoption of the wavelength as the standard unit for spectrum lines.
Moreover, advances in spectroscope technology, including high-dispersion prism spectroscopes and diffraction gratings, enhanced the resolution and precision of spectral observation, enabling detailed mapping of the solar spectrum. These developments marked the initial steps in transforming spectroscopy from a qualitative practice into a rigorous quantitative science through improved accuracy and standardisation, facilitating comparisons between observations.
Fig. 4: Merz’s simple direct-vision spectroscope.
Fig. 5: Compound spectroscope by Huggins, 1868.
Fig. 6: Secchi’s compound stellar spectroscope.
Despite inevitable technological advancements in instruments for recording dispersions, the scientist’s eye always played an active role, even under the strain to which it was subjected during observations. As Secchi recalls, Kirchhoff himself admitted that his spectrum map was incomplete, ‘owing to [his] eyes being weakened by the continual observations which the subject rendered necessary’.[28]
Photography in spectroscopy
Although the limitations of visual observation were acknowledged, photography initially failed to compensate for these shortcomings in the 19th century. Celebrated as an automatic recording tool, it nevertheless entered spectroscopy in a complex and delayed manner. Photography did not immediately become a guarantor of ‘mechanical objectivity’[29] but served instead as an auxiliary tool to support visual observations, which were often more reliable than photographic recording and reproduction.
Photography faced several disadvantages that limited its use in spectroscopy. It could not compete with visual observations in recording spectra, nor with lithography in reproduction and printing processes. Many spectra were not ‘photogenic’ due to their weak intensity, and exposure times were insufficient as the wet emulsion evaporated too quickly. Additionally, the wet collodion’s uneven sensitivity to different spectrum regions required assembling spectral fragments into a composite view. Consequently, around 1860, the use of wet plates was inevitably supplemented by traditional methods such as drawing and lithographic printing.
Even in duplication and printing, photography encountered significant challenges. Scientists who relied extensively on photography often had to depend on lithography to publish their results. For many years, photography could not match the quality and precision of spectrum maps produced through lithography or steel engraving. Only with the advent of advanced photomechanical reproduction techniques could photographs be effectively transferred to print, ensuring the accuracy and quality required by scientific publications.
The presumed mechanical objectivity of photography as a scientific recording device struggled to assert its evidentiary value in spectroscopy, despite the efforts of pioneers like Henry Draper, Lewis M. Rutherfurd, and Henry A. Rowland, who aimed to reinforce the almost dogmatic belief in ‘nature registering herself’.[30] Contrary to their convictions, human intervention was unavoidable at every stage – from adjusting exposure times to selecting plates and filters, and enhancing or manipulating images during development. Final retouching of negatives and selecting the best positive prints for publication required critical judgment, demonstrating the indispensable role of human expertise and ‘trained judgment’.[31] Despite not becoming the quintessential tool of mechanical objectivity in 19th century spectroscopic practices, photography remained a constant presence alongside visual observations. The human eye and photography proved to be mutually indispensable and complementary tools.
By the 1880s, the combined use of visual observations and photographs had become well-established. The choice between visual observations, photography, or both depended on the spectral range being studied and the specific research objectives. Photographs were effective in resolving spectra from the F line through the blue, violet, and ultraviolet regions, while visual observations were preferred for the red-to-green region, between the D and F lines.[32]
Another form of complementarity between the two processes was that photographs revealed faint lines inaccessible to the human eye through prolonged exposure times, while visual observations allowed for distinguishing closely spaced lines that appeared as a single line in photographs. In brief, the advantages of using photography included revealing spectroscopic details invisible to the naked eye through the enlargement of specific portions of spectrum maps; recording faint spectrum lines with prolonged exposure times; photographing different spectral regions with varied exposure times to circumvent the problem of uneven emulsion sensitivity, thus creating a composite image; and extending the analysis to spectral regions otherwise inaccessible to the human eye.
Therefore, contrary to definitions of spectroscopic photography as a ‘portrait of nature’ or a ‘chemical retina’,[33] understanding its onto-epistemology requires partially abandoning its conception as the ‘pencil of nature’ – a conception suggested by the etymology of the term ‘photography’, coined by John F. W. Herschel. Photography is not only a ‘writing’ tool but also a technique of mechanical and partly automatic readability. As Lecoq de Boisbaudran wrote:
Light, which arrives to us from the depths of space, brings us the table of reactions of these bodies which we do not possess and perhaps never shall! It is up to us to learn to read it.[34]
This interpretation situates the main stages of photographic archaeology at the dawn of mechanical readability techniques around 1800, when visual observation of light began incorporating thermometric and chemical methods to reveal and detect invisible light. Supporting this interpretation are the pioneering discoveries of infrared and ultraviolet radiation. Around 1800, William Herschel recorded heat along the visible spectrum and beyond its red limit, detecting an invisible radiation he called ‘dark heat’.[35] In 1801, Johann W. Ritter discovered ultraviolet rays through silver chloride exposure – a direct precursor to photography – mapping the limits of human visual sensitivity.
Fig. 7: William Herschel’s comparison of visual intensity (curve R) and thermometric intensity (curve S) in the spectrum. Herschel’s diagram exemplifies an image that makes the invisible visible.
Shortly after Arago’s announcement of Daguerre’s invention, Jean Baptiste Biot emphasised that the daguerreotype’s emulsion was sensitive to actinic and calorific rays. This made photography a tool for reading the deep nature of the universe, translating physical signals into measurements and data. In spectroscopy, photography can thus be juxtaposed with drawing, graphs, diagrams, and tables – ‘proto-data media’ that make invisible forces visible.[36] These instruments capture the information of a multidimensional universe on a two-dimensional medium, allowing us to read ‘the physical nature of the stars, and reveal the quality of the matter that composes them’.[37]
Secchi’s role in spectroscopy and astrophysics
Although Secchi did not use photography to create his stellar maps – whose first satisfactory examples are attributed to Rutherfurd (1864), Draper (1872), and Rowland (1886 and 1888) – he recognised the potential of photographic devices for detecting phenomena invisible to the human eye. Throughout his scientific career, Secchi employed drawing and traditional printing techniques for his spectrum maps. However, he did not hesitate to use photography and other devices to observe and measure the invisible effects of electromagnetic radiation. As early as 1851, he used Melloni’s thermomultiplier to measure temperature variations caused by infrared radiation and worked with daguerreotypes to demonstrate the chemical action of light.[38]
Secchi was not averse to using photography in astronomy. From 1851, he began experimenting with wet collodion and produced his first photographs of the Moon, culminating in a photographic atlas of the lunar surface in 1857.[39] He photographed the Sun and other celestial bodies,[40] primarily using the photographic device to detect the invisible and to automate the recording processes. In this context, photography played a crucial role in demonstrating not only human physiological limitations but also in measuring the relativity of sensory responses to eliminate the subjective component from the study of phenomena. Secchi placed great hopes in photography and the mechanisation of measurement devices. He envisioned that integrating photographic and electrical technologies would progressively eliminate human intervention, improving the accuracy and reliability of astronomical observation.
Secchi significantly contributed to the birth and development of spectroscopy and astrophysics through an elaborate star classification system focused on understanding their physical and chemical nature. Despite the enormous variety of stars, he observed that their spectra could be grouped into a few well-defined types.[41] This reinforced the conclusion drawn by Bunsen and Kirchhoff in 1859 that the material composition of the Sun, many other stars, and Earth was essentially identical.
From 1852 onwards, Secchi described many stellar spectra in detail. He based stellar spectroscopy on three fundamental points: the general colour of the stars, the position of the spectrum lines, and comparisons with artificial flames and the Sun. In the early 1860s he was among the first, alongside William Huggins, Giovan Battista Donati, and Lewis M. Rutherfurd, to contribute to stellar spectroscopy by compiling a catalogue of star spectra, classifying each star by colour, spectral type, and spectrum line intensity.
Fig. 8: Absorption spectrum characteristic of Secchi’s classification from Type I to Type V.
Secchi’s methodology was based on a comparative approach aimed at the logical-phenomenological distinction of types, classified by well-defined physical and chromatic characteristics. This classificatory process, based on recognising similarities between the stellar spectra and predetermined standard spectra, facilitated star cataloguing and established light as a kind of natural unit of measurement intrinsic to celestial bodies and the universe, detached from any conventional and material standard.
Secchi prioritised recognising the gestalt, patterns, and typicality of certain line groups. Spectrum maps, with their lines, grids, and alphanumeric elements, bring drawing closer to photography as technical images, since both produce measurements and data within a diagrammatic representation that Parikka defines as the culture of invisuality.[42] Here, the numbers, letters, and lines of spectroscopy delineate an archaeology of data computation and automation, signalling a shift from the primacy of representation to that of data operations. This transition is further underscored by the notion that the map, conceptualised as a ‘calculation interface’, functions as a bridge between pre-digital instrumental images and computational operational images.
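Read in modern terms, this comparative procedure resembles template matching. The sketch below is purely illustrative and is in no sense Secchi’s method, which remained visual and qualitative: toy ‘spectra’ are vectors of line intensities, and a star is assigned the type whose standard spectrum it most resembles. All numbers are invented.

```python
# Illustrative template matching over toy spectra (all values invented):
# each 'spectrum' is a list of line intensities at fixed positions.
STANDARDS = {
    "Type I (white-blue, strong hydrogen lines)": [9, 1, 1, 8, 1],
    "Type II (yellow, many thin metallic lines)": [4, 6, 5, 4, 5],
    "Type III (orange-red, banded spectrum)":     [2, 7, 8, 2, 7],
}

def distance(a, b):
    """Sum of absolute differences between two intensity profiles."""
    return sum(abs(x - y) for x, y in zip(a, b))

def classify(spectrum):
    """Assign the type whose standard spectrum is nearest to the input."""
    return min(STANDARDS, key=lambda t: distance(spectrum, STANDARDS[t]))

print(classify([8, 2, 1, 9, 2]))  # -> 'Type I (white-blue, ...)'
```

The point of the sketch is the shift it dramatises: once spectra are encoded as comparable data, classification becomes an operation performed on measurements rather than a depiction.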
From visibility to data: Secchi’s stellar spectroscopy as the archaeology of computational images
With the analysis presented thus far, the objective has been to explore the material epistemology underlying spectroscopic techniques, shedding light on the debates and comparisons among key scientists from approximately 1814 to 1888. This field exhibits significant complexity, both in techniques used and goals pursued. Among these figures, Secchi holds a central role for two main reasons: he was one of the first to systematically direct the spectroscope towards the sky, and he contributed significantly to the birth of astrophysics by developing the first taxonomy of stellar types.
Table 1: The correspondence between Secchi’s classification and the Henry Draper Catalogue system developed at the Harvard Observatory.
Secchi’s work fits within a broader 19th century effort to unify the sciences of the heavens and the Earth into a cohesive scientific framework beyond disciplinary boundaries. This aspiration also found fertile ground in stellar spectroscopy, reflecting a shared aim among scientists of the time for a unified vision of science. For instance, in his Traité philosophique d’astronomie populaire (1844), Comte proposed that such unity should be based on a single cognitive approach. In contrast, Secchi rooted this unity in the abstract principle of energy conservation, anticipating a trend that would later become the norm.[43]
In this context, stellar spectroscopy supports an epistemic shift from visual technologies producing mimetic representations to what can be termed the ‘invisual’. This concept refers to the use of ‘non-optical’ instruments to create abstract models, translating phenomena into images conceived as data repositories. This shift prefigures key characteristics of computational visualisation techniques, positioning spectroscopy as ‘an already “digital” analogue practice’.[44]
Secchi’s taxonomic system began to shift towards a data-driven approach, integrating human perception with systematic data interpretation. His spectral types organised starlight into a measurable alphanumeric system that facilitated cataloguing and quantitative analysis. According to Secchi,
The phenomenon of vision through light is certainly remarkable […], yet despite the great importance this fact holds for us, […] the physicist can only regard the illuminating action of radiation as a secondary phenomenon, merely subordinate to another, far more general law.[45]
Secchi’s stellar types aimed to transform spectra into a proto-set of quantifiable data, laying the foundation for a systematic theory of stellar composition. In contrast, other spectroscopists, while contributing to new measurement and visualisation techniques, did not prioritise an organised and systematic classification of stars that synthesised qualitative and quantitative criteria to enable analytical operations beyond mere description of phenomena. Secchi stated,
How are so many diverse phenomena connected? Only carefully studied and arranged facts can enlighten us in this difficult discussion, and any theory that seeks to take shape must be preceded by an orderly exposition and review of empirical data. […] Connecting the immense number of phenomena that constitute these manifestations of these forces and demonstrating their mutual connection is what I have endeavoured to achieve in these pages.[46]
Secchi’s spectral series enabled the identification of regularities, inference, and the construction of theoretical models, using the prism as a crucial epistemic device, not only to extend vision, but to decode light based on geometric and numerical principles. Secchi’s ‘physical catalogue of stars’[47] represents a systematic attempt to organise astral knowledge through a qualitative-quantitative approach based on the analysis of large datasets. While this study cannot delve into the role of statistics in science, particularly in spectroscopy, Secchi’s taxonomic work can be positioned within what Galison[48] describes as the ‘statistical tradition’ of scientific research, which prioritises the aggregation and systematic organisation of large datasets over the isolated identification of a ‘golden event’. Although Secchi’s approach did not fully align with the principles of this tradition, it foreshadowed a shift towards a logic of data evaluation that would later become dominant, for example, at Harvard Observatory with the Henry Draper Catalogue.[49]
Spectral maps are images that display data, revealing how these spectral series, through their distribution of characteristics and properties, represent one of the genealogical lines leading to computational pattern recognition and machine vision as an epistemically significant principle. Secchi’s series can be seen as an embryonic stage of today’s big data culture, in which an image, taken individually, holds little significance but functions as part of a larger and more complex network, akin to what we now define as a database.
At this point, it becomes clear that stellar spectroscopy can be considered one of the early techniques of data visualisation that began to emerge as early as the 17th century, marking a distinct departure from traditional Renaissance visual forms. As Cubitt suggests – adapted here to fit our context – data visualisation represents ‘the first new form of visual culture to arise since the Renaissance, as a new symbolic form’.[50] This shift allows us to differentiate the role of mathematics in Renaissance visual practices from the mathematical contributions made in spectroscopic imaging.
For instance, unlike Renaissance perspective, which aimed to create faithful or plausible representations of visible reality, or later techniques such as photogrammetry and photometry, which sought to capture and measure what the eye could see or might see, spectroscopic imaging did not focus on directly depicting visible spaces or objects. Instead, the spectral lines, representing light frequencies absorbed or emitted by chemical elements, required interpretation as data, rather than direct visual representation. This shift marks a transition to what could be called a ‘radiographic episteme’,[51] where the act of seeing is transformed into a process of analysis and revelation through extracting and decoding data.
In contrast to the Renaissance episteme, which focused on geometric representation and spatial organisation centred on the observer, the radiographic episteme introduces an analytical dimension that transcends visual perception. Spectroscopy embodies this transition from a descriptive to an analytical and quantitative vision: it is not a traditional optical technique but a mode of knowledge that penetrates the visible surface and reveals the internal structure of light, analysing its composition like an anatomy of radiation.
With Secchi’s spectroscopic series, we observe a gradual departure from the ‘camera body imitative of the human eye socket and lenses designed to reproduce “perspective as a symbolic form”’.[52] Instead, there is an embrace of the unity of physical forces, echoing Secchi’s title,[53] in a universe no longer defined by a fixed focal point or primarily visual orientation. Although the definitive transition of spectroscopy from a predominantly phenomenological-empirical domain to a theoretically consistent interpretation occurs with Bohr’s 1913 quantum hypothesis – replacing trained visual observation with calculation – Secchi’s spectral series had already established their role as alphanumeric codes, positioning spectroscopy as a technique for measuring light rather than representing it. Spectroscopy emerges as a technique for measuring light itself – a form of meta-photography in the etymological sense of the term – that shares with computational imaging the prerogative of being a measurement of light, a measurement that ‘never yields the “sign” of a thing, but only its measure, a signal value, a number’.[54] The spectroscopic image is therefore a system of signals ‘susceptible to interpretation and pattern recognition’,[55] not a system of signs.
In this sense, spectral maps as operational images – no longer representative or mimetic, but designed to visualise the invisual nature of light – anticipate how contemporary computational images transform and translate invisible phenomena into readable and interpretable data. Drawing on Ernst’s concept of a ‘paper machine’,[56] these spectral maps employ alphanumeric characters, calculations, and classifications to enable inferences and deductions, analyses, and comparisons. They represent, in nuce, one of the distinctive features of computational images: the idea that the image is, above all, a set of data, outlining a genealogy that extends from perspective to the diagrammatic.
Furthermore, it is plausible to consider spectroscopy as a technique that, in some way, anticipated what John May attributes to the CCD[57] of the digital camera. Both rely on a process of detecting and fragmenting energy: just as the CCD sensor detects and breaks down light into discrete, measurable electrical charges, spectroscopy similarly fragments electromagnetic radiation into discrete, measurable lines. Additionally, digital images – regardless of their visual nature or the methods used for their organisation – are always visualisations in the sense attributed by Cubitt, as ‘specific orderings of data in a form perceptible to humans’, whether they are digital photographs or their respective histograms.[58]
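The analogy can be made tangible with a toy reduction of a continuous signal to discrete measurements. In the hypothetical Python sketch below – an invented spectrum function, not real data – the same signal is (a) integrated into CCD-like pixel bins and (b) reduced, spectroscopy-like, to the positions of its peaks:

```python
import math

def spectrum(wl):
    """Toy continuous spectrum: faint background plus two emission peaks."""
    return 0.1 + sum(math.exp(-((wl - c) / 4) ** 2) for c in (4861, 6563))

wavelengths = range(4800, 6700)

# (a) CCD-like detection: integrate intensity into discrete 100-unit bins,
# yielding a grid of measurable 'charges'
bins = {}
for wl in wavelengths:
    key = wl // 100 * 100
    bins[key] = bins.get(key, 0.0) + spectrum(wl)

# (b) spectroscopy-like detection: reduce the signal to discrete line
# positions (local maxima rising above the background)
lines = [wl for wl in wavelengths
         if spectrum(wl) > 0.5
         and spectrum(wl) >= spectrum(wl - 1)
         and spectrum(wl) >= spectrum(wl + 1)]
print(lines)  # -> [4861, 6563]
```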
Another anticipatory element of Secchi’s stellar spectroscopy in relation to computational images is the increasing distance between detection devices and direct sensory observation. The growing separation between observer and observed leads to visions increasingly detached from immediate human perception. We are no longer dealing with images perceived directly by the human eye, but with highly conventional configurations, where perception and detection undergo processes of deconstruction – analysing electromagnetic radiation by decomposing it as it passes through the prism – and reconstruction, synthesising the spectral lines onto a common photosensitive or printed medium. The goal is to produce and visualise parameters for establishing potential connections between different orders of phenomena.
Secchi’s classification grid provided a framework for later advancements in spectral classification and the remote sensing of the chemical composition of stars, continued at the Harvard Observatory with new tools and methodologies. In that context, pattern recognition and remote sensing contributed to the emergence of a new cultural technique, in which computational machines would become the driving force in organising invisual observations into operational actions.[59]
It is no coincidence that Parikka cites the collection and cataloguing of astronomical photographs at Harvard Observatory as a precursor ‘to contemporary preparation of images into datasets into models’,[60] where Secchi’s manual inscriptions transitioned into automatic inscriptions, culminating in the Henry Draper Catalogue, which further refined Secchi’s stellar classification. Spectroscopy, with its capacity to transduce the invisible into data, represents a pivotal point in the history of visualisation techniques – a sort of proto-computational imaging that prefigures many of the modern technologies underpinning contemporary visualisation techniques.
Conclusion
Secchi’s contributions to spectroscopy mark a pivotal turning point for 19th century astronomy. Through his spectroscopic classification of stars, Secchi introduced an approach that transcended mere visual representation of celestial objects. In a period when fully quantitative methods were not yet feasible, Secchi worked to integrate qualitative and emerging quantitative approaches, laying the groundwork for a more integrated scientific practice. This shift not only influenced astronomical methods but also helped pave the way for the emergence of computational imaging.
Secchi was the first to propose a systematic classification based on the spectral characteristics of stars. Even though his system was eventually replaced by the Draper Catalogue, his work established the groundwork for later quantitative and systematic methods. By organising stars into a structured system, Secchi’s approach demonstrated the potential of spectral analysis for revealing the physical and chemical properties of stars, moving beyond traditional observational practices.
Although Secchi’s spectral classification system was later refined by more quantitatively driven techniques, it remains fundamental to understanding the development of systematic methodologies in astrophysics. His innovative use of spectroscopic tools and his structured collection of data combined empirical observation with a growing emphasis on scientific standardisation.
In addition, Secchi’s systematic efforts to catalogue stellar spectra can be seen as a precursor to modern practices of big data management and pattern recognition in data visualisation. His spectral types represent early attempts to encode relationships between phenomena, anticipating the transition from visual representation to what we now define as invisuality in operational computational images. Moreover, his cataloguing approach helped to lay the groundwork for the creation of repositories that systematise and make vast amounts of data accessible. His methods of analysing distant celestial objects also foreshadowed remote sensing techniques.
It is important to emphasise, however, that although the structure of this analysis may suggest a chronological progression, it is intended solely to aid in understanding the complexity of the subject. The historical reality of scientific innovations is far more nuanced and discontinuous. This structure seeks to highlight key intersections without implying a linear or necessary sequence of events.
Additionally, it is essential to acknowledge that spectroscopy was not the only technique involved in the transition towards computational imaging. Numerous other scientific innovations of the 19th century also played a key role in laying the groundwork for this pivotal shift in the history of visualisation technologies.
Nevertheless, this essay aims to grant Secchi’s spectroscopy the recognition it has often been denied. Its significance deserves to be reconsidered not only in the context of the history of astronomical science, but also as a foundational contribution to many core features of contemporary digital techniques, rooted in the era of emerging astrophysics.
Author
Francesco Giarrusso earned a PhD in Communication Sciences from Nova University of Lisbon in 2013 and has been a member of the Center for Philosophy of Science at the University of Lisbon since 2012. He is currently pursuing a second PhD at Catholic University of Milan, focusing on images and astronomical devices, with research interests in Visual Studies and Media Archaeology.
References
Aubin, D., Bigg, C. and Sibum, H. (eds). The heavens on earth: Observatories and astronomy in nineteenth-century science and culture. Durham: Duke University Press, 2010.
Chen, X. Instrumental traditions and theories of light: The uses of instruments in the optical revolution. Dordrecht-Boston-London: Springer, 2000.
Chinnici, I. and Consolmagno, G. (eds). Angelo Secchi and nineteenth century science: The multidisciplinary contributions of a pioneer and innovator. Cham: Springer, 2021.
Chinnici, I. Decoding the stars: A biography of Angelo Secchi, Jesuit and scientist. Leiden: Brill, 2019.
Comte, A. Cours de philosophie positive (mathématiques, astronomie, physique et chimie), Tome deuxième. Paris: Bachelier, 1835.
Daston, L. and Galison, P. Objectivity. New York: Zone Books, 2007.
Diodato, R. and Eugeni, R. ‘L’immagine algoritmica: abbozzo di un lessico’, L’immagine algoritmica tra media, arte, società, No. 41-42, 2024: 5-21.
Dvořák, T. and Parikka, J. ‘Measuring photographs’, Photographies, Vol. 14, No. 3, 2021: 443-457.
Ernst, W. Technologos in being: Radical media archaeology & the computational machine. New York: Bloomsbury Academic, 2021.
Eugeni, R. Capitale algoritmico. Cinque dispositivi postmediali (più uno). Brescia: Editrice Morcelliana, 2021.
Gal, O. and Chen-Morris, R. ‘Empiricism Without the Senses: How the Instrument Replaced the Eye’ in The body as object and instrument of knowledge: Studies in history and philosophy of science, edited by C. Wolfe and O. Gal. Dordrecht: Springer, 2010.
_____. ‘Baroque Optics and the Disappearance of the Observer: From Kepler’s Optics to Descartes’ Doubt’, Journal of the History of Ideas, Vol. 71, No. 2, 2010: 191-217.
Galison, P. Image and logic: A material culture of microphysics. Chicago: University of Chicago Press, 1997.
Grespi, B. ‘Archaeology of the Postphotographic Image’, L’immagine algoritmica tra media, arte, società, No. 41-42, 2024: 119-141.
Hagen, W. ‘Die Entropie der Fotografie. Skizzen zu einer Genealogie der digital-elektronischen Aufzeichnung’ in Paradigma Fotografie. Fotokritik am Ende des fotografischen Zeitalters, edited by H. Wolf. Frankfurt-Main: Suhrkamp, 2002.
Henning, M. and Mikuriya, J. ‘Light Sensitive Material: An Introduction’, Photographies, Vol. 14, No. 3, 2021: 381-394.
Hentschel, K. Visual cultures in science and technology: A comparative history. Oxford: Oxford University Press, 2014.
_____. Mapping the spectrum: Techniques of visual representation in research and teaching. Oxford: Oxford University Press, 2002.
Maynard, P. ‘Photo Mensura’ in Reasoning in measurement, history and philosophy of technoscience, edited by N. Mößner and A. Nordmann. Abingdon-on-Thames: Routledge, 2017.
Packer, J. ‘Screens in the sky: SAGE, surveillance, and the automation of perceptual, mnemonic, and epistemological labor’, Social Semiotics, Vol. 23, No. 2, 2013: 173-195.
Parks, L. ‘Vertical Mediation’ in Life in the age of drone warfare, edited by L. Parks and C. Kaplan. Durham: Duke University Press, 2017: 134-157.
Parikka, J. Operational images: From the visual to the invisual. Minneapolis: University of Minnesota Press, 2023.
Parikka, J. and Dvořák, T. (eds). Photography off the scale: Technologies and theories of the mass image. Edinburgh: Edinburgh University Press, 2021.
Secchi, A. Le stelle. Saggio di astronomia siderale. Milan: Fratelli Dumolard, 1877.
_____. Sugli spettri prismatici delle stelle fisse. Rome: Tipografia delle Belle Arti, 1868.
_____. Le scoperte spettroscopiche in ordine alla ricerca della natura de’ corpi celesti. Rome: Tipografia delle Belle Arti, 1865.
_____. L’unità delle forze fisiche. Rome: Tipografia forense, 1864.
_____. Sui recenti progressi dell’astronomia. Rome: Tipografia delle Belle Arti, 1859.
Vogl, J. ‘Becoming-media: Galileo’s Telescope’, Grey Room, No. 29, 2007: 14-25.
www.fotomuseum.ch/en/2017/01/09/image-after-i-photography-as-print-and-as-scientific-instrument/ (accessed on 31 October 2024)
[1] See Parikka 2023. For accuracy, the concept of operational image was originally developed by Harun Farocki.
[2] As Grespi (2024) reminds us, Farocki suggests the idea that the operational image predates digital technology in his 1986 film Images of the World and the Inscription of War.
[3] The bibliography on the subject is vast. For this essay, I have mainly relied on Eugeni 2021, and Diodato & Eugeni 2024, pp. 5-21.
[4] Secchi 1859, p. 3.
[5] Ibid.
[6] Gauss quoted in Aubin & Bigg & Sibum 2010, p. 9.
[7] For further details, see Secchi 1865; Secchi 1868; Secchi 1877.
[8] Secchi 1859, p. 18.
[9] Comte 1835, pp. 8-9.
[10] Laplace quoted in Aubin & Bigg & Sibum 2010, p. 4.
[11] Blumenberg quoted in Vogl 2007, p. 21.
[12] The teleological approach to optics in the 17th century viewed vision as an intentional process, wherein the visual faculty directly communicated with visible objects through agents like ‘species’ or ‘visual rays’. These agents were considered essential for ensuring the truthfulness and adequacy of perceived images. According to Pecham (quoted by Gal & Chen-Morris in Wolfe & Gal 2010, pp. 136-137), ‘a species produced by a visible object has the essential property of manifesting the object of which it is the likeness’.
[13] Gal & Chen-Morris in Wolfe & Gal 2010, p. 138.
[14] Ibid., p. 136.
[15] Ibid., pp. 134-138.
[16] This definition appears as the subtitle of Kepler’s work Ad Vitellionem Paralipomena (1604).
[17] Vogl 2007, p. 7.
[18] Ibid., p. 17.
[19] See Chen 2000.
[20] Significant contributions include the studies of Thomas Young, David Brewster, and Hermann von Helmholtz on colour perception. Likewise, Zöllner’s sensory investigations, Janssen’s physiological research on the eye, and Fraunhofer’s analyses of spectral sensitivity in colours provide noteworthy examples.
[21] Regarding the quality of light, refer to Whewell quoted in Chen 2000, pp. 118-121.
[22] The origins of stellar spectroscopy trace back to Fraunhofer’s early spectral maps; in 1823, using an objective prism spectrometer, he observed that spectral lines appeared in different positions depending on the astronomical body observed.
[23] See Secchi 1864.
[24] See Chen 2000, pp. 109-128.
[25] Alter quoted in Hentschel 2002, p. 40.
[26] Notably, Huggins – one of the founding figures of astrophysics – began his work in stellar spectroscopy following the arrival in England of news regarding Kirchhoff and Bunsen’s discoveries. See Hentschel 2002, p. 346.
[27] Roscoe quoted in Hentschel 2002, p. 345.
[28] Kirchhoff quoted in Hentschel 2002, p. 140.
[29] See Daston & Galison 2007.
[30] See Hentschel 2002, pp. 454-455.
[31] See Daston & Galison 2007.
[32] See Hentschel 2002, p. 350.
[33] Ibid., p. 220.
[34] Quoted in Hentschel 2002, p. 308.
[35] Hentschel 2014, p. 49.
[36] In this context, it is worth recalling Newton’s description of the apparatus with crossed prisms, which demonstrates how light passing through two prisms creates a continuous curve representing the variation of refractive indices with wavelengths. This process, where the distribution of coloured light is generated through interaction with prisms, can be regarded as a self-generating spectral diagram.
[37] Secchi 1865, p. 4.
[38] Ibid., p. 14.
[39] Chinnici 2019, pp. 61-62.
[40] Secchi 1859, pp. 158-160.
[41] Although Secchi’s system was later replaced by the Harvard system, developed by Edward Pickering, Williamina Fleming, Antonia Maury, and Annie Jump Cannon between the 1890s and early 1900s, Secchi is recognised as a fundamental pioneer in stellar classification. His stellar classification system comprises five main types:
Type I: White-blue stars, like Sirius, with a spectrum characterised by seven colours interrupted by four large black lines.
Type II: White-yellow stars, similar to the Sun, with a spectrum formed by thin black lines.
Type III: Red or orange stars, like Alpha Orionis, with a spectrum featuring nebulous bands and black lines.
Type IV: Very red stars, rich in carbon.
Type V: Celestial bodies with pronounced emission lines.
[42] See Parikka 2023.
[43] Aubin & Bigg & Sibum in Aubin & Bigg & Sibum 2010, p. 8.
[44] Grespi in Diodato & Eugeni 2024, p. 123.
[45] Secchi 1864, p. 131.
[46] Ibid., p. 2.
[47] Secchi 1877, p. 65.
[48] See Galison 1997.
[49] Pickering published the first Draper Catalogue of Stellar Spectra in 1891, documenting 10,351 stars. This work laid the foundation for the Henry Draper Catalogue, which detailed the magnitude, position, and classification of 225,300 stars across a series of volumes published between 1918 and 1924.
[50] www.fotomuseum.ch/en/2017/01/09/image-after-i-photography-as-print-and-as-scientific-instrument/
[51] For further examination, see Packer 2013; Parks in Parks & Kaplan 2017, pp. 134–157.
[52] Cubitt in Parikka & Dvořák 2021, p. 31.
[53] Secchi 1864.
[54] Hagen in Wolf 2002, p. 235.
[55] See Parikka 2023, pp. 57-64.
[56] As Ernst (2021) explains, a ‘paper machine’ denotes theoretical or algorithmic processes carried out on paper, serving as precursors to computational systems. These written algorithms bridge abstract theory with concrete operations, prefiguring the mechanisation and automation in digital technologies.
[57] In this regard, it is noteworthy how Hagen highlights the close relationship between spectroscopy and digital photography, emphasising that spectroscopy paved the way for quantum mechanics – a crucial development for the advancement and implementation of the semiconductors used in digital camera sensors. See Hagen in Wolf 2002, pp. 195-235.
[58] Cubitt in Parikka & Dvořák 2021, p. 30.
[59] As suggested by Parikka (2023), consider platforms such as DeepMind, neural networks, machine vision, and autonomous vehicles – all of which are examples of machine learning systems that rely on images as training datasets and models.
[60] Parikka 2023, p. 84.