Project description: Artificial tongues have been receiving increasing attention for perceiving the five basic tastes. However, it remains challenging to fully mimic human tongue-like performance for tastes such as astringency. Mimicking the mechanism of astringency perception on the human tongue, we use a saliva-like chemiresistive ionic hydrogel anchored to a flexible substrate as a soft artificial tongue. When exposed to astringent compounds, hydrophobic aggregates form inside the microporous network and transform it into a micro/nanoporous structure with enhanced ionic conductivity. This unique human tongue-like performance enables tannic acid to be detected over a wide range (0.0005 to 1 wt %) with high sensitivity (0.292 (wt %)^-1) and a fast response time (~10 s). As a proof of concept, our sensor can detect the degree of astringency in beverages and fruits using a simple wipe-and-detect method, making it a powerful platform for future applications involving humanoid robots and taste-monitoring devices.
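As an illustrative sketch only (the description reports a sensitivity and detection range but not the sensor's response model), one can assume a linear response at the stated sensitivity of 0.292 (wt %)^-1 and invert a reading into a tannic acid concentration estimate; the linear model here is an assumption, not the authors' calibration:

```python
# Hypothetical linear calibration for the astringency sensor described above.
# Assumption: relative response = SENSITIVITY * concentration (the source
# reports only the sensitivity and range, not the calibration curve).

SENSITIVITY = 0.292              # reported sensitivity, (wt %)^-1
DETECTION_RANGE = (0.0005, 1.0)  # reported detection range, wt %

def estimate_concentration(relative_response: float) -> float:
    """Invert the assumed linear calibration: response = S * c."""
    c = relative_response / SENSITIVITY
    lo, hi = DETECTION_RANGE
    if not (lo <= c <= hi):
        raise ValueError(f"estimated {c:.4g} wt % is outside the reported range")
    return c

print(estimate_concentration(0.0292))  # a response of 0.0292 maps to 0.1 wt %
```

Within this assumed model, the wide reported range (over three orders of magnitude) means readings should always be checked against the range limits before being trusted, as done above.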
Project description: Wound closure with surgical sutures is a critical challenge in flexible endoscopic surgery. Substantial efforts have been made to develop functional, smart surgical sutures that either monitor wound conditions or ease the complexity of knot tying. Although research interest in smart sutures enabled by soft robotic technologies has grown for years, it remains challenging to develop a soft robotic structure that has a physical form similar to conventional sutures while offering a self-tightening knot or anchor to close the wound. This paper introduces a new concept of smart sutures that can be programmed to achieve a desired, uniform tension distribution while offering self-tightening knots or automatically deploying secured anchors. The core technology is a soft hydraulic artificial muscle that elongates and contracts under applied fluid pressure. Each suture is equipped with a pressure-locking mechanism to hold its temporary elongated state and to induce a self-shrinking ability. The puncturing and holding forces of the smart sutures with anchors are examined. Ex vivo experiments on fresh porcine stomach and colon demonstrate the usefulness of the new smart sutures. These approaches are expected to pave the way for further development of smart sutures that will benefit research, training, and commercialization in the surgical field.
Project description: We propose the use of bio-inspired robots equipped with soft sensor technologies to gain a better understanding of the mechanics and control of animal movement. Soft robotic systems can be used to generate new hypotheses and uncover fundamental principles underlying animal locomotion and sensory capabilities, which can subsequently be validated in living organisms. Physical models increasingly include lateral body movements, notably back and tail bending, which are necessary for horizontal-plane undulation in model systems ranging from fish to amphibians and reptiles. We present a comparative study of physical modeling in conjunction with soft robotics and integrated soft, hyperelastic sensors that monitor local pressures and enable local feedback control, and we discuss issues related to understanding the mechanics and control of undulatory locomotion. A parallel approach combining live-animal data with biorobotic physical modeling promises to be beneficial for gaining a better understanding of systems in motion.
Project description: The tumor microenvironment plays a crucial role in soft tissue sarcoma development and response to therapy. We used spatial transcriptomics to analyze the spatial distribution of malignant, immune, and other stromal cells present within soft tissue sarcomas.
Project description: Bat echolocation is a dynamic behavior that allows for real-time adaptations in the timing and spectro-temporal design of sonar signals in response to a particular task and environment. To enable detailed, quantitative analyses of adaptive sonar behavior, echolocation call design was investigated in big brown bats, trained to rest on a stationary platform and track a tethered mealworm that approached from a starting distance of about 170 cm in the presence of a stationary sonar distracter. The distracter was presented at different angular offsets and distances from the bat. The results of this study show that the distance and the angular offset of the distracter influence sonar vocalization parameters of the big brown bat, Eptesicus fuscus. Specifically, the bat adjusted its call duration to the closer of two objects, distracter or insect target, and the magnitude of the adjustment depended on the angular offset of the distracter. In contrast, the bat consistently adjusted its call rate to the distance of the insect, even when this target was positioned behind the distracter. The results hold implications for understanding spatial information processing and perception by echolocation.
Project description: We have proposed that haptic activation of the shape-selective lateral occipital complex (LOC) reflects a model of multisensory object representation in which the role of visual imagery is modulated by object familiarity. Supporting this, a previous functional magnetic resonance imaging (fMRI) study from our laboratory used inter-task correlations of blood oxygenation level-dependent (BOLD) signal magnitude and effective connectivity (EC) patterns based on the BOLD signals to show that the neural processes underlying visual object imagery (objIMG) are more similar to those mediating haptic perception of familiar (fHS) than unfamiliar (uHS) shapes. Here we employed fMRI to test a further hypothesis derived from our model, that spatial imagery (spIMG) would evoke activation and effective connectivity patterns more related to uHS than fHS. We found that few of the regions conjointly activated by spIMG and either fHS or uHS showed inter-task correlations of BOLD signal magnitudes, with parietal foci featuring in both sets of correlations. This may indicate some involvement of spIMG in HS regardless of object familiarity, contrary to our hypothesis, although we cannot rule out alternative explanations for the commonalities between the networks, such as generic imagery or spatial processes. EC analyses, based on inferred neuronal time series obtained by deconvolution of the hemodynamic response function from the measured BOLD time series, showed that spIMG shared more common paths with uHS than fHS. Re-analysis of our previous data, using the same EC methods as those used here, showed that, by contrast, objIMG shared more common paths with fHS than uHS. Thus, although our model requires some refinement, its basic architecture is supported: a stronger relationship between spIMG and uHS compared to fHS, and a stronger relationship between objIMG and fHS compared to uHS.
Project description: Soft robots driven by stimuli-responsive materials have unique advantages over traditional rigid robots, such as large actuation, light weight, good flexibility, and biocompatibility. However, the large actuation of soft robots inherently coexists with difficulty in high-precision control. This article presents a soft artificial-muscle-driven robot mimicking a cuttlefish, with a fully integrated on-board system including a power supply and wireless communication. Without any motors, the movements of the cuttlefish robot are actuated solely by a dielectric elastomer, which exhibits muscle-like properties including large deformation and high energy density. Reinforcement learning is used to optimize the control strategy of the cuttlefish robot instead of manual adjustment. Starting from scratch, reinforcement learning enhanced the robot's swimming speed by 91%, reaching 21 mm/s (0.38 body lengths per second). The design principles behind the structure and control of the robot can potentially guide device designs for demanding applications such as flexible devices and soft robots.
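The learning-from-scratch idea above can be sketched in miniature. This is not the paper's algorithm or controller: it is a minimal hill-climbing loop over a hypothetical two-parameter actuation pattern (amplitude and frequency), scored by a made-up stand-in reward in place of measured swimming speed:

```python
import random

def swimming_speed(params):
    """Hypothetical stand-in for a measured speed reward (mm/s).

    Invented for illustration: a smooth bump peaking at amplitude 0.8,
    frequency 2.0, with a maximum of 21 mm/s (the speed reported above).
    """
    amplitude, frequency = params
    return 21.0 - 40.0 * (amplitude - 0.8) ** 2 - 5.0 * (frequency - 2.0) ** 2

def optimize(initial, iters=500, step=0.05, seed=0):
    """Improve the actuation parameters by simple stochastic hill climbing."""
    rng = random.Random(seed)
    best, best_speed = list(initial), swimming_speed(initial)
    for _ in range(iters):
        # Perturb the current best parameters and keep the change
        # only if the (simulated) robot swims faster.
        candidate = [p + rng.gauss(0, step) for p in best]
        s = swimming_speed(candidate)
        if s > best_speed:
            best, best_speed = candidate, s
    return best, best_speed

params, speed = optimize([0.4, 1.0])  # deliberately poor starting gait
print(round(speed, 1))
```

Real dielectric-elastomer control involves a much larger state and action space and noisy physical rewards, which is why the authors used reinforcement learning rather than manual tuning; the sketch only conveys the reward-driven improvement loop.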
Project description: Congenitally blind infants are deprived not only of visual input but also of visual influences on the intact senses. The important role that vision plays in the early development of multisensory spatial perception [1-7] (e.g., in crossmodal calibration [8-10] and in the formation of multisensory spatial representations of the body and the world [1, 2]) raises the possibility that impairments in spatial perception are at the heart of the wide range of difficulties that visually impaired infants show across spatial [8-12], motor [13-17], and social [8, 18, 19] domains. But investigations of early development are needed to clarify how visually impaired infants' spatial hearing and touch support their emerging ability to make sense of their body and the outside world. We compared sighted (S) and severely visually impaired (SVI) infants' responses to auditory and tactile stimuli presented on their hands. No statistically reliable differences in the direction or latency of responses to auditory stimuli emerged, but significant group differences emerged in responses to tactile and audiotactile stimuli. The visually impaired infants showed attenuated audiotactile spatial integration and interference, weighted tactile cues more heavily than auditory cues when the two were presented in conflict, and showed a more limited influence of representations of the external layout of the body on tactile spatial perception [20]. These findings uncover a distinct phenotype of multisensory spatial perception in early postnatal visual deprivation. Importantly, evidence of audiotactile spatial integration in visually impaired infants, albeit to a lesser degree than in sighted infants, signals the potential of multisensory rehabilitation methods in early development.
Project description: The appearance of visual objects varies substantially across the visual field. Could such spatial heterogeneity be due to undersampling of the visual field by neurons selective for stimulus categories? Here, we show that which parts of a bistable vase-face image observers perceive as figure and ground depends on the retinal location where the image appears. The spatial patterns of these perceptual biases were similar regardless of whether the images were upright or inverted. Undersampling by neurons tuned to an object class (e.g., faces) or variability in general local versus global processing cannot readily explain this spatial heterogeneity. Rather, these biases could result from idiosyncrasies in low-level sensitivity across the visual field.
Project description: The plant hormone abscisic acid (ABA) is best known for its role as a regulator of responses to abiotic stressors. Components of the ABA signaling pathway are thus considered promising targets for securing yield under stress. ABA levels rise in response to abiotic stress, triggering a number of physiological and metabolic responses that promote plant survival under unfavorable conditions. ABA elicits its effects by binding to a family of soluble receptors called PYR/PYL/RCAR. The Arabidopsis genome encodes 14 PYL proteins, but knowledge of the differential biological functions of these members is scarce. In this work, we took a gain-of-function approach to predict receptor-specific functionality. We introduced a set of activating mutations in the mobile gate and α-helix of subfamily II ABA receptors. These mutations constitutively enforce the active, ABA-bound conformation. We then transformed ABA-deficient mutants with the constitutive receptors and monitored suppression of the ABA-deficiency phenotype. Our findings suggest that subfamily II receptors have differential activity in regulating transpiration and the transcription of oxidation and stress response genes. These differences derive partly from differential accumulation of PYLs in the guard cells. Data mining revealed that only a small subset of PYL receptors is transcribed in the guard cells at steady state. Because the guard-cell-specific receptors differ in their affinities for ABA and for the PP2C co-receptor, we propose that combined partial receptor redundancy facilitates gradual activation of the ABA output signal, via biochemically distinct receptors, as its cellular concentration rises.