Project description:Monitoring traps are important components of integrated pest management applied against important fruit fly pests, including Bactrocera oleae (Gmelin) and Ceratitis capitata (Wiedemann), Diptera of the family Tephritidae, which cause crop losses calculated at billions of euros per year worldwide. Pests can be controlled with ground pesticide sprays, the efficiency of which depends on knowing the time, location, and extent of infestations as early as possible. Trap inspection is currently carried out manually using the McPhail trap, and mass spraying is decided based on a decision protocol. We introduce the term 'insect biometrics' in the context of entomology as a measure of a characteristic of the insect (in our case, the spectrum of its wingbeat) that allows us to identify its species and build devices to help face old enemies with modern means. We turn a McPhail-type trap into an electronic one by installing an array of photoreceptors coupled to an infrared emitter, guarding the entrance of the trap. The beating wings of insects flying into the trap intercept the light, and the light fluctuation is turned into a recording. Custom-made electronics are developed and placed as an external add-on kit, without altering the internal space of the trap. Counts from the trap are transmitted over a mobile communication network. This trap introduces a new automated remote-monitoring method, distinct from audio- and vision-based systems. We evaluate our trap on large numbers of insects in the laboratory by enclosing the electronic trap in insectary cages. Our experiments assess its potential to deliver reliable data that can be used to reliably initiate the spraying process at large scales and also to monitor the impact of spraying, as the system eliminates the time lag between acquiring insect counts and delivering them to a central agency.
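The wingbeat-based 'insect biometrics' idea can be sketched in a few lines: take the spectrum of the light-fluctuation recording and compare its dominant peak against per-species wingbeat-frequency bands. The two species names come from the description above, but the frequency bands, sampling rate, and low-frequency cutoff below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Hypothetical wingbeat-frequency bands (Hz); illustrative only.
WINGBEAT_BANDS_HZ = {
    "Bactrocera oleae": (180.0, 260.0),
    "Ceratitis capitata": (270.0, 350.0),
}

def dominant_wingbeat_hz(signal, fs):
    """Return the strongest spectral peak of the recording (ignoring drift)."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[freqs < 50.0] = 0.0  # suppress DC and low-frequency drift
    return freqs[np.argmax(spectrum)]

def classify(signal, fs):
    """Assign a species if the dominant peak falls inside a known band."""
    f0 = dominant_wingbeat_hz(signal, fs)
    for species, (lo, hi) in WINGBEAT_BANDS_HZ.items():
        if lo <= f0 <= hi:
            return species, f0
    return "unknown", f0

# Example: a synthetic 220 Hz "wingbeat" sampled at 8 kHz for 0.5 s
fs = 8000
t = np.arange(0, 0.5, 1.0 / fs)
species, f0 = classify(np.sin(2 * np.pi * 220.0 * t), fs)
```

In practice the spectrum of a real recording also carries harmonics of the fundamental, which is why a full classifier would compare whole spectra rather than a single peak; the sketch keeps only the simplest decision rule.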
Project description:Insect and pollinator populations are vitally important to the health of ecosystems, food production, and economic stability, but are declining worldwide. New, cheap, and simple monitoring methods are needed to inform management actions and should be available to researchers around the world. Here, we evaluate the efficacy of a commercially available, close-focus automated camera trap for monitoring insect-plant interactions and insect behavior. We compared two video settings (scheduled and motion-activated) to a traditional human observation method. Our results show that camera traps with scheduled video settings detected more insects overall than human observers, but relative performance varied by insect order. Scheduled cameras significantly outperformed motion-activated cameras, detecting more insects of all orders and size classes. We conclude that scheduled camera traps are an effective and relatively inexpensive tool for monitoring interactions between plants and insects of all size classes, and their accessibility and ease of set-up allow for potentially widespread use. The digital video format also offers the benefits of recording, sharing, and verifying observations.
Project description:Raster-scan optoacoustic mesoscopy (RSOM), also termed photoacoustic mesoscopy, offers novel insights into vascular morphology and pathophysiological biomarkers of skin inflammation in vivo at depths unattainable by other optical imaging methods. Using ultra-wideband detection and focused ultrasound transducers, RSOM can achieve an axial resolution of 4 µm and a lateral resolution of 20 µm at depths of several millimeters. However, motion effects may degrade performance and reduce the effective resolution. To provide high-quality optoacoustic images in clinical measurements, we developed a motion correction algorithm for RSOM. The algorithm is based on observing disruptions of the ultrasound wavefront generated by the vertical movement of the melanin layer at the skin surface. From the disrupted skin surface, a smooth synthetic surface is generated, and the offset between the two surfaces is used to correct the relative position of the ultrasound detector. We test the algorithm on measurements of healthy and psoriatic human skin and achieve an effective resolution up to 5-fold higher than before correction. We discuss the performance of the correction algorithm and its implications in the context of multispectral mesoscopy.
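The surface-offset idea can be sketched as follows: detect the (motion-disrupted) skin surface in each A-line, fit a smooth synthetic surface through it, and shift each A-line by the difference between the two. The B-scan layout, the polynomial smoothing, and all parameter values below are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def detect_surface(bscan):
    """Index of the strongest reflector (e.g. melanin layer) in each A-line."""
    return np.argmax(np.abs(bscan), axis=0)

def motion_correct(bscan, smooth_deg=3):
    """Flatten vertical motion by aligning the detected surface to a smooth fit."""
    n_depth, n_x = bscan.shape
    surface = detect_surface(bscan).astype(float)
    x = np.arange(n_x)
    # Smooth synthetic surface: here a low-order polynomial fit (assumption).
    synthetic = np.polyval(np.polyfit(x, surface, smooth_deg), x)
    offsets = np.rint(surface - synthetic).astype(int)
    corrected = np.zeros_like(bscan)
    for i, off in enumerate(offsets):
        corrected[:, i] = np.roll(bscan[:, i], -off)  # undo vertical motion
    return corrected

# Example: a flat reflector at depth index 40, jittered by simulated motion
rng = np.random.default_rng(0)
bscan = np.zeros((128, 64))
jitter = rng.integers(-3, 4, size=64)
for i in range(64):
    bscan[40 + jitter[i], i] = 1.0
flat = motion_correct(bscan)
```

After correction, the detected surface of the jittered example is substantially flatter than before, which is the one-dimensional analogue of the resolution recovery reported above.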
Project description:Background: The recently described sensor-crosstalk error in the multiple-breath washout (MBW) device Exhalyzer D (Eco Medics AG) could strongly influence clinimetric properties and the current interpretation of MBW results. This study reanalyzes MBW data from clinical routine in the corrected software version Spiroware® 3.3.1 and evaluates the effect on outcomes. Methods: We included nitrogen-MBW data from healthy children and children with cystic fibrosis (CF) from previously published trials and ongoing cohort studies. We specifically compared the lung clearance index (LCI) analyzed in Spiroware 3.2.1 and 3.3.1 with regard to (i) feasibility, (ii) repeatability, and (iii) validity as an outcome parameter in children with CF. Results: (i) All previously collected measurements could be reanalyzed, with unchanged feasibility in Spiroware 3.3.1. (ii) Short- and midterm repeatability of LCI was similar in both software versions. (iii) Clinical validity of LCI remained similar in Spiroware 3.3.1, although absolute LCI values were lower. Discrimination between health and disease was comparable between both software versions. The increase in LCI over time was less pronounced, at 0.16 LCI units/year (95% confidence interval [CI] 0.08; 0.24) versus 0.30 LCI units/year (95% CI 0.21; 0.38) in 3.2.1. Response to intervention in children receiving CF transmembrane conductance modulator therapy showed a comparable improvement in LCI in both Spiroware versions. Conclusion: Our study confirms that the clinimetric properties of LCI remain unaffected after correction for the cross-sensitivity error in the Spiroware software.
Project description:Background: Captures of codling moth, Cydia pomonella (L.), in traps are used to establish action thresholds and time insecticide sprays. The need for frequent trap inspections in often remote orchards has created a niche for remote-sensing smart traps. A smart trap baited with a five-component pheromone-kairomone blend was evaluated for codling moth monitoring among an assemblage of nontargets in apple and pear orchards. Results: Codling moth captures did not differ between the smart trap and a standard trap when both were checked manually. However, the correlation between automatic and manual counts of codling moth in the smart traps was low, R² = 0.66-0.87. False-negative identifications by the smart trap were infrequent (<5%), but false-positive identifications accounted for up to 67% of the count. These errors were primarily due to the misidentification of three moth species of fairly similar size to codling moth: the apple clearwing moth Synanthedon myopaeformis (Borkhausen), the oriental fruit moth Grapholita molesta (Busck), and the carnation tortrix Cacoecimorpha pronubana (Hübner). Other false-positive counts were less frequent and included misidentifications of dipterans, other arthropods, and patches of moth scales, as well as double counting of some moths. Conclusion: Codling moth was successfully monitored remotely with a smart trap baited with a nonselective sex pheromone-kairomone lure, but automatic counts were inflated in some orchards due to mischaracterizations of primarily similar-sized nontarget moths. Improved image-identification algorithms are needed for smart traps baited with less-selective lures and with lure sets targeting multiple species.
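Error rates like those above translate directly into standard detection metrics. A minimal sketch, using made-up counts consistent with the reported magnitudes (false positives up to 67% of the automatic count, false negatives below 5%) rather than the study's actual data:

```python
def trap_metrics(true_pos, false_pos, false_neg):
    """Precision and recall of a smart trap's automatic counts."""
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    return precision, recall

# Hypothetical tally: 33 correct codling moth detections, 67 nontarget
# misidentifications (67% of 100 automatic counts), and 2 missed moths.
precision, recall = trap_metrics(33, 67, 2)
```

The asymmetry is the point: with a nonselective lure, recall can stay high (few missed moths) while precision collapses, which is exactly the count inflation described above.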
Project description:Attention to black carbon (BC) has been rising due to its effects on human health as well as its contribution to climate change. Measurements of BC are challenging, as currently used devices are either expensive or impractical for continuous monitoring. Here, we propose an optoacoustic sensor to address this problem. The sensor utilizes a novel ellipsoidal design for refocusing the optoacoustic signal with minimal acoustic energy losses. To reduce the cost of the system without sacrificing accuracy, an overdriven laser diode and a quartz tuning fork are used as the light source and the sound detector, respectively. The prototype was able to detect BC particles and to accurately monitor changes in concentration in real time, in very good agreement with a reference instrument. The response of the sensor was linearly dependent on the BC particle concentration, with a normalized noise-equivalent absorption coefficient (NNEA) for soot of 7.39 × 10^-9 W cm^-1 Hz^-1/2. Finally, the prototype was able to perform NO2 measurements, demonstrating its ability to accurately monitor both particulate and gaseous pollutants. The proposed sensor has the potential to offer a significant economic impact for BC environmental measurements and source apportionment technologies.
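The NNEA figure of merit normalizes the minimum detectable absorption coefficient by optical power and detection bandwidth, which is what makes photoacoustic sensors with different light sources comparable. A minimal sketch of the arithmetic; the power and averaging-time values below are illustrative assumptions, not the prototype's operating parameters:

```python
import math

def nnea(alpha_min_cm, laser_power_w, integration_time_s):
    """Normalized noise-equivalent absorption (W cm^-1 Hz^-1/2):
    minimum detectable absorption coefficient x optical power x sqrt(t)."""
    return alpha_min_cm * laser_power_w * math.sqrt(integration_time_s)

# e.g. a hypothetical minimum detectable absorption of 7.39e-8 cm^-1 at
# 0.1 W of optical power and 1 s of averaging reproduces the quoted figure.
value = nnea(7.39e-8, 0.1, 1.0)
```

Because the sqrt(t) factor converts averaging time into an equivalent 1-Hz bandwidth, a lower NNEA means better sensitivity per watt of laser power, independent of how long any particular measurement averaged.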
Project description:The performance of Signal Quality Monitoring (SQM) techniques under different multipath scenarios is analyzed. First, SQM variation profiles are investigated as critical requirements for evaluating the theoretical performance of SQM metrics. The sensitivity and effectiveness of SQM approaches for multipath detection and mitigation are then defined and analyzed by comparing SQM profiles and multipath error envelopes for different discriminators. The analytical discussion includes two discriminator strategies, namely the narrow and high-resolution correlator techniques, for BPSK(1) and BOC(1,1) signaling schemes. Data analysis is also carried out for static and kinematic scenarios to validate the SQM profiles and examine SQM performance in actual multipath environments. Results show that although SQM is sensitive to medium- and long-delay multipath, its effectiveness in mitigating these ranges of multipath errors varies with the tracking strategy and signaling scheme. For short-delay multipath scenarios, the multipath effect on pseudorange measurements remains mostly undetected due to the low sensitivity of SQM metrics.
Project description:Attractant-based trap networks are important elements of invasive insect detection, pest control, and basic research programs. We present a landscape-level, spatially explicit model of trap networks, focused on detection, that incorporates variable attractiveness of traps and a movement model for insect dispersion. We describe the model and validate its behavior using field trap data on networks targeting two species, Ceratitis capitata and Anoplophora glabripennis. Our model will assist efforts to optimize trap networks by 1) introducing an accessible and realistic mathematical characterization of the operation of a single trap that lends itself easily to parametrization via field experiments and 2) allowing direct quantification and comparison of sensitivity between trap networks. Results from the two case studies indicate that the relationship between number of traps and their spatial distribution and capture probability under the model is qualitatively dependent on the attractiveness of the traps, a result with important practical consequences.
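The core quantity in a detection-focused trap-network model of this kind is the probability that at least one trap in the network captures an insect, given per-trap capture probabilities that fall off with distance and scale with trap attractiveness. A minimal sketch; the exponential attraction kernel and all parameter values below are illustrative assumptions, not the model from the study.

```python
import numpy as np

def capture_prob(dist_m, attractiveness, scale_m=50.0):
    """Per-trap capture probability for an insect at distance dist_m.
    Exponential distance decay is an assumed, illustrative kernel."""
    return attractiveness * np.exp(-dist_m / scale_m)

def network_sensitivity(insect_xy, trap_xys, attractiveness):
    """P(captured by at least one trap) = 1 - prod_i (1 - p_i),
    treating the traps as independent capture opportunities."""
    dists = np.linalg.norm(trap_xys - insect_xy, axis=1)
    p = capture_prob(dists, attractiveness)
    return 1.0 - np.prod(1.0 - p)

# Three traps on a 100 m grid corner, insect at (20, 20), attractiveness 0.5
traps = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
p_detect = network_sensitivity(np.array([20.0, 20.0]), traps, 0.5)
```

Averaging this quantity over insect positions (or over a dispersal model's trajectories) gives a single sensitivity number per network, which is what makes direct comparison between candidate trap layouts possible.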
Project description:Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so that capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardising each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps.