
Onboard background-oriented schlieren imaging using consumer-grade hardware

Published online by Cambridge University Press:  21 November 2024

M. K. Quinn*
Affiliation:
School of Engineering, University of Manchester, Manchester, United Kingdom
W. J. Crowther
Affiliation:
School of Engineering, University of Manchester, Manchester, United Kingdom
K. Wood
Affiliation:
School of Engineering, University of Manchester, Manchester, United Kingdom
K. Kabbabe
Affiliation:
School of Engineering, University of Manchester, Manchester, United Kingdom
*
Corresponding author: M. K. Quinn; Email: [email protected]

Abstract

A demonstration of a fully onboard method for generating background-oriented schlieren (BOS) data on a jet exhaust is presented. Readily available commercial camera equipment is used to capture in-flight imagery of a miniature jet engine exhaust mounted on a custom-built model aircraft. The image acquisition setup and processing algorithms are described. A new image registration process, which reduces the degrading effects of vibration and flexure of the airframe, is developed and presented along with the underpinning BOS algorithm. Results show that jet flows can be visualised with a fully contained system on a single aircraft, and demonstrate how a simple technique such as BOS can be democratised to the extent that the cost of conducting in-flight jet measurements falls within the budget of any model aircraft flyer.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press on behalf of the Royal Aeronautical Society

Nomenclature

BOS        background-oriented schlieren
COTS       commercial off-the-shelf
DFT        discrete Fourier transform
ROI        region of interest
dpi        dots per inch
${\mathcal I}_{\rm OFF}$        wind-off reference image
${\mathcal I}_{\rm ON}$        wind-on image
$u$, $u'$        horizontal displacement in pixels
$v$, $v'$        vertical displacement in pixels
$\alpha$        under-relaxation factor

1.0 Introduction

Visualisation of fluid flows using techniques such as schlieren and shadowgraph imaging or particle image velocimetry has enabled great advances in the understanding of dynamics in transparent media. These techniques have been instrumental in laboratory research for decades; however, the majority of visualisation methods remain best suited to controlled laboratory settings. Implementing fluid visualisation methods in industrial, or even field trial, settings is extremely challenging and requires novel approaches [1–3]. One method of visualising fluid flows that has enjoyed success in industrial and field trial environments is background-oriented schlieren (BOS). This technique, first described by Meier [4] and shortly after by Raffel [5], generates schlieren-like results and enables the spatial visualisation of density gradients by tracking minuscule displacements in background patterns. These displacements are caused by gradients in refractive index (related to changes in density via the Gladstone-Dale equation) in the flow. Unlike regular schlieren, which requires precise alignment of mirrors, light sources and optics, BOS requires only a camera and a background pattern with sufficient texture.

The BOS method visualises spatial density gradients by tracking displacements between image pairs. One image is a reference condition (often called a wind-off image), and the other contains the flow to be visualised (known as a wind-on image). BOS results can be utilised in a variety of ways, from investigating flow patterns, quantifying jet spreading and estimating convective speeds, to calculating exhaust temperature, as shown by Song et al. [6]. The 2015 review by Raffel [7] and the recent paper by Schwarz and Braukmann [8] provide excellent descriptions of how to conduct BOS experiments and are recommended reading for all subsequent experimenters.

The BOS technique has significant mechanical simplicity advantages over traditional schlieren; however, this comes at the cost of increased post-processing complexity. Post-processing of BOS images requires that displacements be calculated spatially across an image. The two common algorithms for this displacement calculation are cross-correlation (as used in particle image velocimetry) and optical flow. A comparison between these two main approaches has been conducted by several groups [9–12].

Advances in miniature camera systems (primarily driven by smartphone camera development) have enabled researchers to utilise machine vision systems for flow visualisation [13–17] in increasingly challenging scenarios where the mounting of typical laboratory camera systems would be prohibited by size and access restrictions. As the images captured by a BOS system do not have any special requirements (for example, low noise for quantitative intensity measurements), researchers building on the work of Aguirre-Pablo et al. [18] have developed smartphone-based BOS systems by embedding the required processing algorithms in an app [19, 20].

The mechanical simplicity of BOS has enabled its application in a variety of scenarios, including large field measurements [21], tomographic reconstructions [22], and the outstanding work of DLR [5, 23, 24] and NASA [10, 11] applying the method to full vehicles in field tests. Most field tests of BOS, or measurements of full vehicles, have involved the use of a natural background [11, 23] and either a fixed camera position (meaning the aircraft must traverse the measurement field of view) or a chase aircraft, which brings its own complexity and significant cost.

Onboard BOS measurements were presented in the 2007 work of Leopold [25], who mounted two cameras in close proximity to each other onboard a UH-1D helicopter to image a natural background. Leopold used two scales of cross-correlation in the processing: a very large window to estimate the translation between images, and a much finer one to extract the flow features, before presenting the difference between the two as the final result. As the two images were captured simultaneously from the two onboard cameras, vibration effects were minimised; however, both images contain the flow, meaning flow features appeared twice. The onboard, in-flight use of BOS systems, however, remains relatively underdeveloped despite these demonstrations of its potential.

In this paper we demonstrate onboard and in-flight BOS measurements of a jet engine exhaust flow field. The aim is to provide a guide to the challenges and solutions for conducting such a measurement using consumer-grade hardware with a relatively flexible airframe subject to vibration and static aeroelastic deflection.

2.0 Hardware

2.1 Airframe

The aircraft used for this study was a twin-boom model jet aircraft designed and built by a team of academics and students at the University of Manchester [26]. A general arrangement view of the airframe is shown in Fig. 1(a), and an image of the vehicle during flight taken from a tail-mounted camera is shown in Fig. 1(b). The intended purpose of the aircraft was as an educational demonstrator for aircraft structures and flight control. The primary design requirements were that it should be jet powered, easy to build, easy to fly and operable within the UK CAA Open category for drones. The airframe primary structure was made entirely from foamboard, apart from an engine and undercarriage mounting box made from plywood. All-up mass (including 2 kg of fuel) was 22 kg, giving a wing loading of approximately 60 N/m$^2$. The stall speed is around 10 m/s and the cruise speed around 17 m/s. Automatic flight control was implemented via Arduplane running on Pixhawk hardware.

Figure 1. Aircraft used for in-flight BOS demonstration experiments.

Figure 2. BOS arrangement on airframe.

2.2 Engine

Propulsion was provided by a Xicoy X-120 micro gas turbine (maximum thrust 120 N), giving a maximum thrust-to-weight ratio of approximately 0.5. The jet exhaust temperature at maximum thrust is estimated to be around 625$^\circ$C from manufacturer data. Assuming the jet is expanded to atmospheric pressure, the density is approximately 0.38 kg/m$^3$ and the jet exit velocity is 437 m/s, based on the stated maximum thrust value and the nozzle exit area (nozzle diameter = 46 mm). At the estimated exhaust temperature, the speed of sound is 608 m/s, giving an exit Mach number at maximum thrust of around 0.72.
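As a check on these figures, a minimal worked estimate (assuming a uniform, fully expanded jet so that the static thrust equals the exit momentum flux) recovers the stated values:

\[ V_e = \sqrt{\frac{F}{\rho_e A_e}} = \sqrt{\frac{120}{0.38 \times \pi \times 0.023^2}} \approx 437\ {\rm m/s}, \qquad M_e = \frac{V_e}{a_e} = \frac{437}{608} \approx 0.72. \]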

2.3 Optics

An overview of the BOS hardware setup is shown in Fig. 2. A GoPro Hero 12 Black action camera was mounted to the starboard tail boom and aimed at a speckle pattern target on the port tail boom, placed in line with the engine nozzle. Video was recorded at 60 fps with a resolution of 3,840 $\times$ 3,360 pixels, with the exposure time determined by the GoPro's auto-exposure algorithm. The shutter speed, extracted from the raw video metadata, varied through the experiment but was generally around $1/2{,}000$ s, or 500 $\mu$s. This value, although short enough to produce sharp images of the airframe and the background of the aircraft, is not short compared to the flow residence time based on the estimated jet velocity in Section 2.2. As a result, the in-flight jet visualisation can be considered a short-time average and will not resolve transient turbulent fluctuations well.
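As a rough illustration (taking the nozzle diameter as the characteristic length, an assumption made here purely for the estimate), the flow residence time is

\[ t_{\rm res} \approx \frac{D}{V_e} = \frac{0.046}{437} \approx 1.1 \times 10^{-4}\ {\rm s}, \]

roughly one fifth of the 500 $\mu$s exposure, so turbulent structures convect several nozzle diameters during a single exposure.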

The GoPro has a very short focal-length lens, providing a very wide-angle image, as shown in Fig. 3(b). As a result, the speckle pattern only occupies a small region of the image (624 $\times$ 422 pixels), compromising the achievable resolution in the final data. The background speckle pattern was imaged with a spatial resolution of approximately 0.4 mm/pixel. As demonstrated by Schwarz and Braukmann [8], the sensitivity of a BOS system can be increased by increasing the focal length used. A modified GoPro, or a different lens and camera system, would afford a better field of view and higher sensitivity.

Figure 3. Speckle pattern setup.

The BOS background image was generated using a two-level, multi-scale approach. A 4,000 $\times$ 4,000 matrix is initialised with random numbers, with values greater than 0.6 being assigned white (a value of 1) and values lower than 0.6 being set to black (a value of 0). This background is then smoothed using a 15-pixel radius Gaussian filter. A second 4,000 $\times$ 4,000 matrix is initialised in the same way, filtered with a smaller Gaussian filter and multiplied by 0.5 before being added to the original matrix. Finally, the resulting image is passed through a contrast-limited adaptive histogram equaliser with a 1 percent clip limit. A 400 $\times$ 400 pixel sample of the background is shown in Fig. 3(a). This background was then printed on A4 sheets at 1,200 dpi.
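A minimal MATLAB sketch of this background-generation procedure is given below. The second-level filter size and the random seed are not specified above and are assumed here purely for illustration; the stated 15-pixel radius is used directly as the Gaussian standard deviation, and the tiling of the image onto A4 sheets for printing is omitted.

    % Two-level speckle background, following the description above.
    rng(1);                                    % arbitrary seed for a repeatable pattern
    A = double(rand(4000) > 0.6);              % coarse level: white where rand > 0.6
    A = imgaussfilt(A, 15);                    % 15-pixel Gaussian smoothing
    B = double(rand(4000) > 0.6);              % second, finer level
    B = imgaussfilt(B, 5);                     % smaller Gaussian filter (size assumed)
    bg = A + 0.5*B;                            % add the fine detail at half weight
    bg = mat2gray(bg);                         % rescale to [0, 1] before equalisation
    bg = adapthisteq(bg, 'ClipLimit', 0.01);   % CLAHE with a 1 percent clip limit
    imwrite(bg, 'bos_background.png');         % then printed on A4 at 1,200 dpi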

3.0 Image processing

The GoPro records video using the HEVC (H.265) codec, which is challenging to import into other software, so the video data was losslessly converted to MPEG-4 using VLC media player. To reduce the data processing overhead, the frame rate was reduced to 30 fps.

Video data was read into MATLAB as colour frames, from which the green channel was extracted for BOS processing. The wind-off reference condition was generated by averaging 100 frames at the start of the video, before the jet engine start-up sequence was initiated. The wind-on frames begin during engine idling, approximately 5 s before throttle up for take-off. The processing of the data comprised two main steps, image alignment and BOS displacement estimation, which are described separately below.
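A short sketch of this ingest step is shown below (the file name is hypothetical); the green channel is extracted from each colour frame and the first 100 frames are averaged to form the wind-off reference.

    v    = VideoReader('flight_30fps.mp4');   % converted 30 fps video (hypothetical name)
    nRef = 100;                               % frames averaged for the wind-off reference
    Ioff = zeros(v.Height, v.Width);
    for k = 1:nRef
        rgb  = readFrame(v);                  % colour frame
        Ioff = Ioff + double(rgb(:,:,2));     % accumulate the green channel only
    end
    Ioff = Ioff/nRef;                         % wind-off reference image, I_OFF
    % Subsequent calls to readFrame(v) return the wind-on frames, I_ON.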

3.1 Image alignment

The BOS displacements caused by flows are often of the order of a pixel or below, although this is heavily dependent upon the resolution used, the geometry of the test and the strength of the density gradients in the flow. Given these small displacements, images must be aligned to sub-pixel accuracy. During testing with the real-world aircraft, however, flexure of the airframe and vibration of the GoPro mount caused multi-pixel displacement of the speckle target, overwhelming the displacements caused by the flow. It was therefore necessary to perform per-frame alignment of the wind-on images against the reference wind-off image. The alignment was performed using a rectangular region of interest (ROI) selected to cover the speckle pattern behind the jet exhaust in the raw image (Fig. 4). The image alignment was achieved in two discrete steps. Firstly, the ROI was aligned between the wind-on and wind-off cases using a discrete Fourier transform (DFT)-based registration algorithm [27] that aligns images assuming pure translation in the horizontal and vertical image axes. Figure 5 shows the estimated horizontal and vertical displacements, $\Delta u$ and $\Delta v$, between the wind-off and wind-on images. The displacement estimated between the ROI areas was subsequently used to translate the entire wind-on image towards the wind-off image. The DFT registration algorithm produced bulk displacements of the order of 10 pixels during flight, highlighting the severity of the image alignment challenge (the flow itself produces displacements of approximately $\pm$0.5 pixels).
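The sketch below illustrates this first alignment step using the single-file MATLAB implementation of the cited algorithm (dftregistration.m [27]), assumed here to be on the path; the ROI bounds are hypothetical, and the sign convention of the returned shifts should be verified against the registered ROI that the function can also return.

    % Ion is the green channel of the current wind-on frame, read as in the previous sketch.
    rows  = 1200:1620;  cols = 2000:2620;      % hypothetical ROI covering the speckle target
    usfac = 100;                               % upsampling factor: 1/100-pixel registration
    out   = dftregistration(fft2(Ioff(rows, cols)), fft2(Ion(rows, cols)), usfac);
    dRow  = out(3);  dCol = out(4);            % bulk vertical and horizontal shift estimates
    IonDFT = imtranslate(Ion, [dCol, dRow]);   % translate the whole wind-on frame onto Ioff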

Figure 4. ROI (green) and CP (red) markers shown on wind-off image.

Figure 5. Output of DFT registration algorithm. At time 0-5 s the engine is idling. Time 5-11 s is approximately the take-off roll, and 11 s onwards can be considered in-flight.

The wind-off images were captured before the engine was ignited; the displacements already present at $t = 0$ s in Fig. 5 are a result of taxiing around the airfield and can be considered pre-flight. In Fig. 5, the aircraft becomes airborne at approximately 11 s and climbs until approximately 20 s before performing a series of banked turns followed by level flight, thereby flying circuits around the airfield. Output from the onboard Arduplane logger is given in Fig. 6, showing acceleration and angular position data. In Fig. 6 the data are presented in standard aircraft coordinates (x being axially along the vehicle and z being nominally vertical).

Figure 6. Flight data log from Arduplane.

Comparing Figs 5 and 6(b) highlights some interesting features. At $t = 39$ s, the aircraft experiences a vertical acceleration for approximately 1 s, which appears in both figures, implying that vertical acceleration significantly displaces the background pattern. A similar event takes place at approximately $t = 58$ s, which is also visible in the full BOS video.

During heavy vibration, and during manoeuvres in flight, the displacement between wind-off and wind-on images was not pure translation, meaning the DFT algorithm alone was insufficient and the images required supplemental alignment. To achieve this alignment, the ROI was seeded with 96 control points (CP) evenly spaced in eight rows of 12, as shown in Fig. 4. These points were aligned between the wind-off and wind-on images to sub-pixel accuracy using normalised cross-correlation. The wind-off points were used as starting locations in the wind-on image (post DFT alignment), and a 30-pixel window around each starting point was searched to find the corresponding location in the wind-on image. These 96 points were then used to fit a projective transformation, allowing rotation, scale, shear, translation and non-parallel shifting (tilting) to be accounted for.

Given that the DFT algorithm translates the wind-on images to within approximately 1 pixel of the wind-off image, the control-point-based algorithm was successful without any intervention after setting the initial control point locations. In subsequent sections, images translated using only the DFT algorithm will be referred to as DFT-registered, and images translated with the combined DFT and projective transformation will be referred to as CP-registered.
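A compact sketch of the CP-registration step is given below, continuing from the previous sketches. For brevity, MATLAB's cpcorr function is substituted for the 30-pixel correlation search described above: it refines each control point by local normalised cross-correlation, which is adequate here because the residual motion after DFT registration is of the order of one pixel. The control-point grid coordinates are hypothetical.

    [cpX, cpY] = meshgrid(linspace(2040, 2580, 12), linspace(1240, 1580, 8));
    fixedPts  = [cpX(:), cpY(:)];                            % 96 control points: 8 rows of 12
    movingPts = cpcorr(fixedPts, fixedPts, IonDFT, Ioff);    % refine points on the wind-on frame
    tform = fitgeotrans(movingPts, fixedPts, 'projective');  % rotation, scale, shear, translation, tilt
    IonCP = imwarp(IonDFT, tform, 'OutputView', imref2d(size(Ioff)));   % CP-registered frame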

Algorithm 1. Iterative optical flow process

3.2 BOS processing

The BOS processing algorithm chosen for this study is based on optical flow, to give the highest spatial resolution possible. Optical flow tracks intensity gradients between frames and assumes that they have translated across the image space whilst the overall brightness remains constant. The implementation utilised here is based on an iterative Lucas-Kanade [28] approach and is the same algorithm previously developed by Quinn et al. [29]. A summary of the algorithm is presented as Algorithm 1. After image alignment, the horizontal and vertical displacement fields $u$ and $v$ are initialised to zero before calculation of the optical flow (${\rm OF}$) between the wind-off (${\mathcal I}_{\rm OFF}$) and wind-on (${\mathcal I}_{\rm ON}$) images. In this study, a constant-size 7 $\times$ 7 pixel stencil is used for the optical flow calculation. Processing attempts using a multi-resolution pyramidal scheme provided no significant improvement, as the BOS-induced image displacements are all of the order of one pixel or lower. The optical flow calculation yields $u'$ and $v'$, which are scaled with an under-relaxation factor of $\alpha = 0.75$ to smooth convergence of the final result. The under-relaxed values of $u'$ and $v'$ are added to $u$ and $v$ to provide the current estimate of the displacement between the images. This displacement field is then used to warp the wind-on image towards the wind-off image (${\rm WARP}$ in Algorithm 1). The optical flow between the wind-off image and the newly warped wind-on image (${\mathcal I}_{\rm W}$) is then calculated to give an updated estimate for the displacement fields $u'$ and $v'$, and the process is repeated. Ten ($n = 10$) iterations were performed for each frame, with an interim 7 $\times$ 7 Gaussian filter with a standard deviation of 0.5 applied to the calculated displacement fields between successive iterations. Finally, the $u$ and $v$ data were filtered with a 7 $\times$ 7 median filter before presentation and overlay back onto the raw camera footage.
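The sketch below gives a minimal MATLAB rendering of Algorithm 1: a dense Lucas-Kanade solve over the 7 $\times$ 7 stencil, wrapped in the iterative warp-and-update loop described above. Bilinear warping via interp2 and the placement of the interim Gaussian filter on the accumulated fields are implementation choices assumed here.

    alpha = 0.75;  nIter = 10;  w = 3;                     % under-relaxation, iterations, 7x7 stencil
    [X, Y] = meshgrid(1:size(Ioff, 2), 1:size(Ioff, 1));
    u  = zeros(size(Ioff));  v = zeros(size(Ioff));
    Iw = IonCP;                                            % registered wind-on image
    for it = 1:nIter
        [du, dv] = denseLucasKanade(Ioff, Iw, w);          % incremental displacement estimate
        u = imgaussfilt(u + alpha*du, 0.5, 'FilterSize', 7);   % under-relaxed update, 7x7 Gaussian
        v = imgaussfilt(v + alpha*dv, 0.5, 'FilterSize', 7);
        Iw = interp2(X, Y, IonCP, X + u, Y + v, 'linear', 0);  % warp wind-on towards wind-off
    end
    u = medfilt2(u, [7 7]);  v = medfilt2(v, [7 7]);       % final median filter before overlay

    function [u, v] = denseLucasKanade(I1, I2, w)
    % Dense Lucas-Kanade optical flow over a (2w+1) x (2w+1) stencil.
    [Ix, Iy] = gradient(I1);                               % spatial intensity gradients
    It = I2 - I1;                                          % temporal difference
    k  = ones(2*w + 1);                                    % box stencil for the local sums
    Sxx = conv2(Ix.*Ix, k, 'same');  Sxy = conv2(Ix.*Iy, k, 'same');
    Syy = conv2(Iy.*Iy, k, 'same');  Sxt = conv2(Ix.*It, k, 'same');
    Syt = conv2(Iy.*It, k, 'same');
    D = Sxx.*Syy - Sxy.^2;                                 % determinant of the normal equations
    D(abs(D) < 1e-9) = Inf;                                % guard against ill-conditioned pixels
    u = (-Syy.*Sxt + Sxy.*Syt)./D;                         % per-pixel least-squares solution
    v = ( Sxy.*Sxt - Sxx.*Syt)./D;
    end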

4.0 Results

The BOS algorithm was initially trialled without any image alignment using data from a ground-based static thrust test, shown in Fig. 7. During throttle-up of the engine, as excess fuel was burnt off, the constancy-of-brightness assumption in the optical flow calculation was broken and the results in the jet core are unreliable, as shown in Fig. 7(a). However, once this fuel was burnt away, the BOS results successfully showed a well-defined turbulent jet with strong mixing in Fig. 7(b).

Figure 7. Vertical displacement BOS data during static thrust test (BOS colourmap $ \pm 1.0$ pixels).

In flight there was significant airframe flexure and additional vibration due to aerodynamic forces and turbulence. Figure 8 shows vertical displacement BOS data at three different flight conditions, processed with three different image alignment approaches. Figure 8(a), (d) and (g) show no perceptible jet exhaust pattern. This result indicates that, despite the success of the static ground test without image alignment (Fig. 7), image alignment is a requirement for in-flight testing. Additionally, a unique transformation is required for every image to transform it back to the wind-off condition captured on the ground. Figure 8(b), (e) and (h) were processed with the DFT registration only, and Fig. 8(c), (f) and (i) with the combined DFT and CP method; both alignment methods show similar results, with the jet outline clearly visible.

Figure 8. Cropped in-flight vertical displacement BOS data during take-off phase (a, b, c), in steady level flight (d, e, f), and during a steeply banked turn (g, h, i). Data processed with no image alignment (a, d, g), DFT-registered (b, e, h), and CP-registered alignment (c, f, i). All BOS data presented on ±1 pixel colourmap.

Figure 8(a), (b) and (c) are taken 15 s from the beginning of the processed video and are representative of the take-off phase of flight. At this time, the background pattern appears rotated counter-clockwise by approximately 0.15$^\circ$. Figure 8(b), which presents vertical displacement from the optical flow algorithm, shows a large negative area on the left of the ROI and a positive area on the right, indicative of rotation. The CP-registered image (Fig. 8(c)) does not show any rotational artefacts.

During level flight, the rotational displacement of the speckle background was much lower than during the take-off phase; however, it was non-zero, as can be seen in Fig. 8(e) and (f). This image was sampled at 60 s from the beginning of the video during benign conditions (Fig. 6). The CP-registered output shows a more uniform background outside of the core jet flow, implying that the CP-registration performs marginally better than the DFT algorithm alone.

The final sample result is taken 100 s into the video data, during a steeply banked turn, and is shown in Fig. 8(g), (h) and (i). Even though the jet core is still visible, the DFT- and CP-registered BOS results both show high levels of noise, comparable to the amplitude of the signal from the jet exhaust flow. This can be attributed to warping of the background speckle pattern and its mount. The bending of the airframe flexes the attached speckle image in such a way that it cannot be registered back to the wind-off condition by an image-constant transformation alone. A piecewise polynomial approach may be useful in correcting this flex; however, other attempts to remedy the warping, such as Gaussian pyramids or a differential displacement calculation (as used by Leopold [25]), resulted in a strong attenuation of the jet displacements with little reduction in noise elsewhere. The method of using a very large stencil to remove background displacements would likely be more effective if the flow being investigated covered a smaller proportion of the BOS field of view. Utilising larger optical flow stencils (9 $\times$ 9, 11 $\times$ 11 and larger) did improve the signal-to-noise ratio of the results, albeit with a compromise in spatial resolution. During subsequent level flight conditions, the BOS results returned to the quality shown in Fig. 8(e) and (f), implying that the bending of the airframe, speckle pattern and mount during turns is elastic.

The flexibility of both the airframe and the speckle pattern mounting does result in a slight crinkling deformation of the background pattern. The effect of this is visible both in the steady flight results (Fig. 8(e) and (f)) and during the banked turn (Fig. 8(h) and (i)). The faint vertical banding in the BOS results in these images is due to crinkling of the paper background and cannot be removed from the final data.

Comparing the flight log data and the BOS video demonstrates a strong correlation between low signal-to-noise ratio results, typified by Fig. 8(h) and (i), and the high levels of roll during flight shown in Fig. 6(c). The asymmetrical load on the airframe during banked turns results in a distortion of the BOS background pattern that cannot be removed whilst maintaining sufficient sensitivity to visualise the jet.

5.0 Conclusion

In-flight acquisition of background-oriented schlieren data for a jet exhaust was demonstrated on a large model jet aircraft using non-optimised, affordable, commercial-off-the-shelf camera hardware. This case study provided a significant additional challenge associated with real-world testing, in that target pixel displacements due to airframe flexibility were up to an order of magnitude larger than the displacements due to the flow density changes being measured. Per-frame image registration was used to remove target pattern translation and rotation with minimal user input prior to application of the optical flow processing. Successful reconstruction of the jet flow field was achieved for all flight phases apart from steeply banked turning flight, where significant asymmetric airframe loads led to distortion that could not be adequately corrected by the registration methods used. The results are encouraging in that useful in-flight data has been obtained relatively quickly and inexpensively using readily available hardware on a vehicle not specifically designed for flow-field flight testing. Use of a camera with a narrower field of view and increased airframe rigidity would enable higher-fidelity results to be obtained; this might enable a detailed reconstruction of the temperature gradients of the jet and allow greater resolution of turbulent mixing processes.

Acknowledgements

The authors would like to acknowledge all the members of the Giant Foamboard Jet project team at the Aerospace Systems Laboratory at the University of Manchester for their contributions to the building and flight testing of the aircraft used in this study. The authors also thank and acknowledge the Ardupilot project (http://www.ardupilot.org/) for the openly available flight control software.

References

[1] Spoelstra, A., de Martino Norante, L., Terra, W., Sciacchitano, A. and Scarano, F. On-site cycling drag analysis with the Ring of Fire, Exp. Fluids, 2019, 60, pp 1–16.
[2] Hong, J., Toloui, M., Chamorro, L.P., Guala, M., Howard, K., Riley, S., Tucker, J. and Sotiropoulos, F. Natural snowfall reveals large-scale flow structures in the wake of a 2.5-MW wind turbine, Nat. Commun., 2014, 5, pp 1–9.
[3] Hargather, M.J. Background-oriented schlieren diagnostics for large-scale explosive testing, Shock Waves, 2013, 23, pp 529–536.
[4] Meier, G. Hintergrund-Schlierenmeßverfahren für räumliche Dichtefelder, DLR, 1999.
[5] Raffel, M., Richard, H. and Meier, G.E.A. On the applicability of background oriented optical tomography for large scale aerodynamic investigations, Exp. Fluids, 2000, 28, pp 477–481.
[6] Song, F., Wu, J., Zhu, Y., Xu, H., Li, Y. and Yu, Z. Temperature field reconstruction method for aero engine exhaust using the colored background oriented schlieren technology, Optoelectron. Lett., 2022, 18, pp 0243–0250.
[7] Raffel, M. Background-oriented schlieren (BOS) techniques, Exp. Fluids, 2015, 56, pp 1–17.
[8] Schwarz, C. and Braukmann, J.N. Practical aspects of designing background-oriented schlieren (BOS) experiments for vortex measurements, Exp. Fluids, 2023, 64, pp 1–19.
[9] Atcheson, B., Heidrich, W. and Ihrke, I. An evaluation of optical flow algorithms for background oriented schlieren imaging, Exp. Fluids, 2009, 46, pp 467–476.
[10] Smith, N.T., Heineck, J.T. and Schairer, E.T. Optical flow for flight and wind tunnel background oriented schlieren imaging, AIAA SciTech Forum, 2017.
[11] Heineck, J.T., Banks, D.W., Smith, N.T., Schairer, E.T., Bean, P.S. and Robillos, T. Background-oriented schlieren imaging of supersonic aircraft in flight, AIAA J., 2021, 59, pp 11–21.
[12] Cakir, B.O., Lavagnoli, S., Saracoglu, B.H. and Fureby, C. Assessment and application of optical flow in background-oriented schlieren for compressible flows, Exp. Fluids, 2023, 64, pp 1–20.
[13] Quinn, M.K., Spinosa, E. and Roberts, D.A. Miniaturisation of pressure-sensitive paint measurement systems using low-cost, miniaturised machine vision cameras, Sensors, 2017, 17, pp 1–21.
[14] Quinn, M.K. Binary pressure-sensitive paint measurements using miniaturised, colour, machine vision cameras, Meas. Sci. Technol., 2018, 29.
[15] Özer, Ö. and Quinn, M.K. Novel particle image velocimetry methods for visualization of thrust reverser's flow interactions, AIAA AVIATION 2022 Forum, 2022.
[16] Eagan, G.D., Lewis, C.J., Alles, R.M., Klingaman, K.C., Davenport, K., Gragston, M.T., Rice, B.E., Hamilton, M.C. and Thurow, B.S. Direct measurements of shock impingement in a Busemann inlet via a miniature embedded imaging system, AIAA SCITECH 2023 Forum, 2023.
[17] Lemarechal, J., Dimond, B.D., Barth, H.P., Hilfer, M. and Klein, C. Miniaturization and model-integration of the optical measurement system for temperature-sensitive paint investigations, Sensors, 2023, 23, (8), p 7075.
[18] Aguirre-Pablo, A.A., Alarfaj, M.K., Li, E.Q., Hernández-Sánchez, J.F. and Thoroddsen, S.T. Tomographic particle image velocimetry using smartphones and colored shadows, Sci. Rep., 2017, 7.
[19] Hayasaka, K. and Tagawa, Y. Mobile visualization of density fields using smartphone background-oriented schlieren, Exp. Fluids, 2019, 60, pp 1–15.
[20] Mercier, B., Hamidouche, S., Gautier, R. and Lacassagne, T. Educational background oriented schlieren based on a Matlab app and a smartphone camera, International Symposium on the Application of Laser and Imaging Techniques to Fluid Mechanics, vol. 20, 2022.
[21] Trolinger, J.D., Buckner, B. and L'Esperance, D. Background-oriented schlieren for the study of large flow fields, Proc. SPIE 9576, Applied Advanced Optical Metrology Solutions, 2015.
[22] Nicolas, F., Donjat, D., Plyer, A., Champagnat, F., Le Besnerais, G., Micheli, F., Cornic, P., Le Sant, Y. and Deluc, J.M. Experimental study of a co-flowing jet in ONERA's F2 research wind tunnel by 3D background oriented schlieren, Meas. Sci. Technol., 2017, 28.
[23] Bauknecht, A., Merz, C.B., Raffel, M., Landolt, A. and Meier, A.H. Blade-tip vortex detection in maneuvering flight using the background-oriented schlieren technique, J. Aircr., 2014, 51.
[24] Bauknecht, A., Merz, C.B. and Raffel, M. Airborne visualization of helicopter blade tip vortices, J. Visualizat., 2017, 20, pp 139–150.
[25] Leopold, F. The application of the colored background oriented schlieren technique (CBOS) to free-flight and in-flight measurements, ICIASF Record, International Congress on Instrumentation in Aerospace Simulation Facilities, 2007.
[26] Crowther, W.J. We made the world's first fully autonomous foamboard jet plane in 40 days, 2024. https://www.youtube.com/watch?v=0SfT4cCuhXg
[27] Guizar-Sicairos, M., Thurman, S.T. and Fienup, J.R. Efficient subpixel image registration algorithms, Opt. Lett., 2008, 33, p 156.
[28] Lucas, B.D. and Kanade, T. An iterative image registration technique with an application to stereo vision, 7th International Joint Conference on Artificial Intelligence, 1981.
[29] Quinn, M.K., Liu, H., Nabawy, M.R.A., Crowther, W.J. and Weigert, S. Unsteady background oriented schlieren measurements at industry scales, RAeS Applied Aerodynamics Conference 2024, 2024.