<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Robotics &#8211; Pharmacy Update Online</title>
	<atom:link href="https://pharmacyupdateonline.com/category/devices-and-technology/robotics/feed/" rel="self" type="application/rss+xml" />
	<link>https://pharmacyupdateonline.com</link>
	<description></description>
	<lastBuildDate>Mon, 03 Oct 2022 15:36:59 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	

<image>
	<url>https://pharmacyupdateonline.com/wp-content/uploads/2020/12/cropped-favicon-512x360.png</url>
	<title>Robotics &#8211; Pharmacy Update Online</title>
	<link>https://pharmacyupdateonline.com</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Robotic drug capsule can deliver drugs to gut</title>
		<link>https://pharmacyupdateonline.com/2022/10/robotic-drug-capsule-can-deliver-drugs-to-gut/</link>
		
		<dc:creator><![CDATA[Charlie King]]></dc:creator>
		<pubDate>Wed, 05 Oct 2022 08:00:11 +0000</pubDate>
				<category><![CDATA[Devices and Technology]]></category>
		<category><![CDATA[Medical Devices]]></category>
		<category><![CDATA[Pharmaceutical Technology]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[biologic drugs]]></category>
		<category><![CDATA[gastroenterology]]></category>
		<category><![CDATA[nucleic acids]]></category>
		<category><![CDATA[protein drugs]]></category>
		<category><![CDATA[Robotic drug capsule]]></category>
		<guid isPermaLink="false">https://www.pharmacyupdate.online/?p=5172</guid>

					<description><![CDATA[One reason that it&#8217;s so difficult to deliver large protein drugs orally is that these drugs can&#8217;t pass through the mucus barrier that lines the digestive tract. This [&#8230;]]]></description>
										<content:encoded><![CDATA[<p id="first" class="lead">One reason that it&#8217;s so difficult to deliver large protein drugs orally is that these drugs can&#8217;t pass through the mucus barrier that lines the digestive tract. This means that insulin and most other &#8220;biologic drugs&#8221; &#8212; drugs consisting of proteins or nucleic acids &#8212; have to be injected or administered in a hospital.</p>
<div id="text">
<p>A new drug capsule developed at MIT may one day be able to replace those injections. The capsule has a robotic cap that spins and tunnels through the mucus barrier when it reaches the small intestine, allowing drugs carried by the capsule to pass into cells lining the intestine.</p>
<p>&#8220;By displacing the mucus, we can maximize the dispersion of the drug within a local area and enhance the absorption of both small molecules and macromolecules,&#8221; says Giovanni Traverso, the Karl van Tassel Career Development Assistant Professor of Mechanical Engineering at MIT and a gastroenterologist at Brigham and Women&#8217;s Hospital.</p>
<p>In a study appearing today in <em>Science Robotics</em>, the researchers demonstrated that they could use this approach to deliver insulin as well as vancomycin, an antibiotic peptide that currently has to be injected.</p>
<p>Shriya Srinivasan, a research affiliate at MIT&#8217;s Koch Institute for Integrative Cancer Research and a junior fellow at the Society of Fellows at Harvard University, is the lead author of the study.</p>
<p><strong>Tunneling through</strong></p>
<p>For several years, Traverso&#8217;s lab has been developing strategies to deliver protein drugs such as insulin orally. This is a difficult task because protein drugs tend to be broken down in the acidic environment of the digestive tract, and they also have difficulty penetrating the mucus barrier that lines the tract.</p>
<p>To overcome those obstacles, Srinivasan came up with the idea of creating a protective capsule that includes a mechanism that can tunnel through mucus, just as tunnel boring machines drill into soil and rock.</p>
<p>&#8220;I thought that if we could tunnel through the mucus, then we could deposit the drug directly on the epithelium,&#8221; she says. &#8220;The idea is that you would ingest this capsule and the outer layer would dissolve in the digestive tract, exposing all these features that start to churn through the mucus and clear it.&#8221;</p>
<p>The &#8220;RoboCap&#8221; capsule, which is about the size of a multivitamin, carries its drug payload in a small reservoir at one end, with the tunneling features in its main body and surface. The capsule is coated with gelatin that can be tuned to dissolve at a specific pH.</p>
<p>When the coating dissolves, the change in pH triggers a tiny motor inside the RoboCap capsule to start spinning. This motion helps the capsule to tunnel into the mucus and displace it. The capsule is also coated with small studs that brush mucus away, similar to the action of a toothbrush.</p>
<p>The spinning motion also helps to erode the compartment that carries the drug, which is gradually released into the digestive tract.</p>
<p>&#8220;What the RoboCap does is transiently displace the initial mucus barrier and then enhance absorption by maximizing the dispersion of the drug locally,&#8221; Traverso says. &#8220;By combining all of these elements, we&#8217;re really maximizing our capacity to provide the optimal situation for the drug to be absorbed.&#8221;</p>
<p><strong>Enhanced delivery</strong></p>
<p>In tests in animals, the researchers used this capsule to deliver either insulin or vancomycin, a large peptide antibiotic that is used to treat a broad range of infections, including skin infections as well as infections affecting orthopedic implants. With the capsule, the researchers found that they could deliver 20 to 40 times more drug than a similar capsule without the tunneling mechanism.</p>
<p>Once the drug is released from the capsule, the capsule itself passes through the digestive tract on its own. The researchers found no sign of inflammation or irritation in the digestive tract after the capsule passed through, and they also observed that the mucus layer reforms within a few hours after being displaced by the capsule.</p>
<p>Another approach that some researchers have used to enhance oral delivery of drugs is to give them along with additional drugs that help them cross through the intestinal tissue. However, these enhancers often only work with certain drugs. Because the MIT team&#8217;s new approach relies solely on mechanical disruptions to the mucus barrier, it could potentially be applied to a broader set of drugs, Traverso says.</p>
<p>&#8220;Some of the chemical enhancers preferentially work with certain drug molecules,&#8221; he says. &#8220;Using mechanical methods of administration can potentially enable more drugs to have enhanced absorption.&#8221;</p>
<p>While the capsule used in this study released its payload in the small intestine, it could also be used to target the stomach or colon by changing the pH at which the gelatin coating dissolves. The researchers also plan to explore the possibility of delivering other protein drugs such as GLP-1 receptor agonists, which are sometimes used to treat type 2 diabetes. The capsules could also be used to deliver topical drugs to treat ulcerative colitis and other inflammatory conditions by maximizing the local concentration of the drugs in the tissue to help treat the inflammation.</p>
<p>The research was funded in part by the National Institutes of Health and MIT&#8217;s Department of Mechanical Engineering.</p>
<p><strong>Journal Reference</strong>:</p>
<ol class="journal">
<li>Shriya S. Srinivasan, Amro Alshareef, Alexandria V. Hwang, Ziliang Kang, Johannes Kuosmanen, Keiko Ishida, Joshua Jenkins, Sabrina Liu, Wiam Abdalla Mohammed Madani, Jochen Lennerz, Alison Hayward, Josh Morimoto, Nina Fitzgerald, Robert Langer, Giovanni Traverso. <strong>RoboCap: Robotic mucus-clearing capsule for enhanced drug delivery in the gastrointestinal tract</strong>. <em>Science Robotics</em>, 2022; 7 (70) DOI: <a href="http://dx.doi.org/10.1126/scirobotics.abp9066" target="_blank" rel="noopener noreferrer nofollow">10.1126/scirobotics.abp9066</a></li>
</ol>
</div>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Machine learning improves human speech recognition</title>
		<link>https://pharmacyupdateonline.com/2022/03/machine-learning-improves-human-speech-recognition/</link>
		
		<dc:creator><![CDATA[Charlie King]]></dc:creator>
		<pubDate>Sun, 06 Mar 2022 10:00:52 +0000</pubDate>
				<category><![CDATA[Central Nervous System]]></category>
		<category><![CDATA[Devices and Technology]]></category>
		<category><![CDATA[Medicines and Therapeutics]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[Acoustical Society of America]]></category>
		<category><![CDATA[ASR]]></category>
		<category><![CDATA[hearing aid]]></category>
		<category><![CDATA[hearing loss]]></category>
		<category><![CDATA[Machine learning]]></category>
		<category><![CDATA[signal-to-noise ratio]]></category>
		<category><![CDATA[speech recognition]]></category>
		<guid isPermaLink="false">https://www.pharmacyupdate.online/?p=2064</guid>

					<description><![CDATA[Hearing loss is a rapidly growing area of scientific research as the number of baby boomers dealing with hearing loss continues to increase as they age. To understand [&#8230;]]]></description>
										<content:encoded><![CDATA[<p id="first" class="lead">Hearing loss is a rapidly growing area of scientific research as the number of baby boomers dealing with hearing loss continues to increase as they age.</p>
<div id="text">
<p>To understand how hearing loss impacts people, researchers study people&#8217;s ability to recognize speech. It is more difficult for people to recognize human speech if there is reverberation, some hearing impairment, or significant background noise, such as traffic noise or multiple speakers.</p>
<p>As a result, hearing aid algorithms are often used to improve human speech recognition. To evaluate such algorithms, researchers perform experiments that aim to determine the signal-to-noise ratio at which a specific number of words (commonly 50%) are recognized. These tests, however, are time- and cost-intensive.</p>
<p>In <em>The Journal of the Acoustical Society of America</em>, published by the Acoustical Society of America through AIP Publishing, researchers from Germany explore a human speech recognition model based on machine learning and deep neural networks.</p>
<p>&#8220;The novelty of our model is that it provides good predictions for hearing-impaired listeners for noise types with very different complexity and shows both low errors and high correlations with the measured data,&#8221; said author Jana Roßbach, from Carl von Ossietzky University.</p>
<p>The researchers calculated how many words per sentence a listener understands using automatic speech recognition (ASR). Most people are familiar with ASR through speech recognition tools like Alexa and Siri.</p>
<p>The study consisted of eight normal-hearing and 20 hearing-impaired listeners who were exposed to a variety of complex noises that mask the speech. The hearing-impaired listeners were categorized into three groups with different levels of age-related hearing loss.</p>
<p>The model allowed the researchers to predict the human speech recognition performance of hearing-impaired listeners with different degrees of hearing loss for a variety of noise maskers with increasing complexity in temporal modulation and similarity to real speech. The possible hearing loss of a person could be considered individually.</p>
<p>&#8220;We were most surprised that the predictions worked well for all noise types. We expected the model to have problems when using a single competing talker. However, that was not the case,&#8221; said Roßbach.</p>
<p>The model created predictions for single-ear hearing. Going forward, the researchers will develop a binaural model since understanding speech is impacted by two-ear hearing.</p>
<p>In addition to predicting speech intelligibility, the model could also potentially be used to predict listening effort or speech quality, as these measures are closely related.</p>
<p><strong>Journal Reference</strong>:</p>
<ol class="journal">
<li>Jana Roßbach, Birger Kollmeier, Bernd T. Meyer. <strong>A model of speech recognition for hearing-impaired listeners based on deep learning</strong>. <em>The Journal of the Acoustical Society of America</em>, 2022; 151 (3): 1417 DOI: <a href="http://dx.doi.org/10.1121/10.0009411" target="_blank" rel="nofollow noopener">10.1121/10.0009411</a></li>
</ol>
</div>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Robot performs first laparoscopic surgery without human help</title>
		<link>https://pharmacyupdateonline.com/2022/02/robot-performs-first-laparoscopic-surgery-without-human-help/</link>
		
		<dc:creator><![CDATA[Charlie King]]></dc:creator>
		<pubDate>Wed, 02 Feb 2022 10:00:11 +0000</pubDate>
				<category><![CDATA[Devices and Technology]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[automated surgery]]></category>
		<category><![CDATA[Johns Hopkins University]]></category>
		<category><![CDATA[keyhole surgery]]></category>
		<category><![CDATA[Laparoscopy]]></category>
		<category><![CDATA[Robot]]></category>
		<category><![CDATA[Smart Tissue Autonomous Robot]]></category>
		<category><![CDATA[surgery]]></category>
		<guid isPermaLink="false">https://www.pharmacyupdate.online/?p=1868</guid>

					<description><![CDATA[A robot has performed laparoscopic surgery on the soft tissue of a pig without the guiding hand of a human &#8212; a significant step in robotics toward fully [&#8230;]]]></description>
										<content:encoded><![CDATA[<p>A robot has performed laparoscopic surgery on the soft tissue of a pig without the guiding hand of a human &#8212; a significant step in robotics toward fully automated surgery on humans. Designed by a team of Johns Hopkins University researchers, the Smart Tissue Autonomous Robot (STAR) is described today in <em>Science Robotics</em>.</p>
<p>&#8220;Our findings show that we can automate one of the most intricate and delicate tasks in surgery: the reconnection of two ends of an intestine. The STAR performed the procedure in four animals and it produced significantly better results than humans performing the same procedure,&#8221; said senior author Axel Krieger, an assistant professor of mechanical engineering at Johns Hopkins&#8217; Whiting School of Engineering.</p>
<p>The robot excelled at intestinal anastomosis, a procedure that requires a high level of repetitive motion and precision. Connecting two ends of an intestine is arguably the most challenging step in gastrointestinal surgery, requiring a surgeon to suture with high accuracy and consistency. Even the slightest hand tremor or misplaced stitch can result in a leak that could have catastrophic complications for the patient.</p>
<p>Working with collaborators at the Children&#8217;s National Hospital in Washington, D.C. and Jin Kang, a Johns Hopkins professor of electrical and computer engineering, Krieger helped create the robot, a vision-guided system designed specifically to suture soft tissue. Their current iteration advances a 2016 model that repaired a pig&#8217;s intestines accurately, but required a large incision to access the intestine and more guidance from humans.</p>
<p>The team equipped the STAR with new features for enhanced autonomy and improved surgical precision, including specialized suturing tools and state-of-the-art imaging systems that provide more accurate visualizations of the surgical field.</p>
<p>Soft-tissue surgery is especially hard for robots because of its unpredictability, forcing them to be able to adapt quickly to handle unexpected obstacles, Krieger said. The STAR has a novel control system that can adjust the surgical plan in real time, just as a human surgeon would.</p>
<p>&#8220;What makes the STAR special is that it is the first robotic system to plan, adapt, and execute a surgical plan in soft tissue with minimal human intervention,&#8221; Krieger said.</p>
<p>A structured-light-based three-dimensional endoscope and a machine learning-based tracking algorithm developed by Kang and his students guide the STAR. &#8220;We believe an advanced three-dimensional machine vision system is essential in making intelligent surgical robots smarter and safer,&#8221; Kang said.</p>
<p>As the medical field moves towards more laparoscopic approaches for surgeries, it will be important to have an automated robotic system designed for such procedures to assist, Krieger said.</p>
<p>&#8220;Robotic anastomosis is one way to ensure that surgical tasks that require high precision and repeatability can be performed with more accuracy and precision in every patient independent of surgeon skill,&#8221; Krieger said. &#8220;We hypothesize that this will result in a democratized surgical approach to patient care with more predictable and consistent patient outcomes.&#8221;</p>
<p>The team from Johns Hopkins also included Hamed Saeidi, Justin D. Opfermann, Michael Kam, Shuwen Wei, and Simon Leonard. Michael H. Hsieh, director of Transitional Urology at Children&#8217;s National Hospital, also contributed to the research.</p>
<p>The work was supported by the National Institute of Biomedical Imaging and Bioengineering of the National Institutes of Health under award numbers 1R01EB020610 and R21EB024707.</p>
<p><strong>Journal Reference</strong>:</p>
<ol>
<li>H. Saeidi, J. D. Opfermann, M. Kam, S. Wei, S. Leonard, M. H. Hsieh, J. U. Kang, A. Krieger. <strong>Autonomous robotic laparoscopic surgery for intestinal anastomosis</strong>. <em>Science Robotics</em>, 2022; 7 (62) DOI: <a href="http://dx.doi.org/10.1126/scirobotics.abj2908">10.1126/scirobotics.abj2908</a></li>
</ol>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>A visit from a social robot improves hospitalized children’s outlook</title>
		<link>https://pharmacyupdateonline.com/2021/10/a-visit-from-a-social-robot-improves-hospitalized-childrens-outlook/</link>
		
		<dc:creator><![CDATA[Charlie King]]></dc:creator>
		<pubDate>Wed, 20 Oct 2021 10:00:55 +0000</pubDate>
				<category><![CDATA[Devices and Technology]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[paediatrics]]></category>
		<category><![CDATA[robotics]]></category>
		<category><![CDATA[social robot]]></category>
		<guid isPermaLink="false">https://www.pharmacyupdate.online/?p=1385</guid>

					<description><![CDATA[A new study from UCLA finds a visit from human-controlled robot encourages a positive outlook and improves medical interactions for hospitalized children. Robin is a social companion robot [&#8230;]]]></description>
										<content:encoded><![CDATA[<p>A new study from UCLA finds a visit from human-controlled robot encourages a positive outlook and improves medical interactions for hospitalized children.</p>
<p>Robin is a social companion robot that stands at about 4 feet tall and has the capabilities to move, talk and play with others while being remotely controlled by humans. Specialists from UCLA Mattel Children&#8217;s Hospital&#8217;s Chase Child Life Program conducted hour-long video visits with young patients using Robin, comparing it to interactions using a standard tablet, from October 2020 to April 2021. At the conclusion of the study period, children and their parents were interviewed about their experiences and child life specialists provided feedback in a focus group. Researchers then used a transcript of the discussion to identify recurrent and salient themes.</p>
<p>Ninety percent of parents who had a visit with Robin indicated they were &#8220;extremely likely&#8221; to request another visit, compared to 60% of parents whose children interacted with the tablet. Children reported a 29% increase in positive affect &#8212; described as the tendency to experience the world in a positive way, including emotions, interactions with others and with life&#8217;s challenges &#8212; after a visit with Robin and a 33% decrease in negative affect. Children who had a tablet visit reported a 43% decrease in positive affect and a 33% decrease in negative affect.</p>
<p>Parents whose children had a visit from Robin reported their children had no change in positive affect and a 75% decrease in negative affect. Parents whose children had a tablet visit reported their children had a 16% increase in positive affect and no change in negative affect.</p>
<p>The study is being presented on October 11 at the American Academy of Pediatrics (AAP) National Conference.</p>
<p>Child life specialists who oversaw visits with Robin reported benefits that included a greater display of intimacy and interactivity during play, increased control over their hospital experience and the formation of a new, trusting friendship.</p>
<p>&#8220;Our team has demonstrated that a social companion robot can go beyond video chats on a tablet to give us a more imaginative and profound way to make the hospital less stressful,&#8221; said Justin Wagner, MD, a pediatric surgeon at UCLA Mattel Children&#8217;s Hospital and senior author of the study. &#8220;As the pandemic continues, our patients are still feeling anxious and vulnerable in a variety of ways, so it&#8217;s critical that we be as creative as possible to make their experiences easier when they need our help.&#8221;</p>
<p>&#8220;We saw the positive effect in children, their families and healthcare workers,&#8221; adds Wagner. The analysis also suggests benefits to staff, including an increased sense of intimacy with and focus on the patient, increased staff engagement in social care and relative ease in maintaining infection control practices.</p>
<p>In the study, child life specialists also reported the challenges of limited time for patient encounters and a learning curve for operating Robin.</p>
<p>The authors say the evidence illustrates benefits for young patients and supports the incorporation of a social robot like Robin in an inpatient pediatric multidisciplinary care setting.</p>
<p>The study&#8217;s other authors are Dr. Gabriel Oland, Joseph Wertz, W. Scott Comulada, Valentina Ogaryan, Megan Pike, and Dr. Shant Shekherdimian of UCLA.</p>
<p>University of California &#8211; Los Angeles Health Sciences. &#8220;A visit from a social robot improves hospitalized children’s outlook: Findings suggest robot telepresence, more than a tablet, provides comfort to young patients.&#8221; ScienceDaily. ScienceDaily, 9 October 2021. &lt;www.sciencedaily.com/releases/2021/10/211009093146.htm&gt;.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Do Alexa and Siri make kids bossier? New research suggests you might not need to worry</title>
		<link>https://pharmacyupdateonline.com/2021/09/do-alexa-and-siri-make-kids-bossier-new-research-suggests-you-might-not-need-to-worry/</link>
		
		<dc:creator><![CDATA[Charlie King]]></dc:creator>
		<pubDate>Tue, 28 Sep 2021 10:00:55 +0000</pubDate>
				<category><![CDATA[Devices and Technology]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[Alexa]]></category>
		<category><![CDATA[paediatrics]]></category>
		<category><![CDATA[Siri]]></category>
		<guid isPermaLink="false">https://www.pharmacyupdate.online/?p=1243</guid>

					<description><![CDATA[University of Washington. Chatting with a robot is now part of many families&#8217; daily lives, thanks to conversational agents such as Apple&#8217;s Siri or Amazon&#8217;s Alexa. Recent research [&#8230;]]]></description>
										<content:encoded><![CDATA[<p>University of Washington.</p>
<p>Chatting with a robot is now part of many families&#8217; daily lives, thanks to conversational agents such as Apple&#8217;s Siri or Amazon&#8217;s Alexa. Recent research has shown that children are often delighted to find that they can ask Alexa to play their favorite songs or call Grandma.</p>
<p>But does hanging out with Alexa or Siri affect the way children communicate with their fellow humans? Probably not, according to a recent study led by the University of Washington that found that children are sensitive to context when it comes to these conversations.</p>
<p>The team had a conversational agent teach 22 children between the ages of 5 and 10 to use the word &#8220;bungo&#8221; to ask it to speak more quickly. The children readily used the word when a robot slowed down its speech. While most children did use <em>bungo</em> in conversations with their parents, it became a source of play or an inside joke about acting like a robot. But when a researcher spoke slowly to the children, the kids rarely used <em>bungo</em>, and often patiently waited for the researcher to finish talking before responding.</p>
<p>The researchers published their findings in June at the 2021 Interaction Design and Children conference.</p>
<p>&#8220;We were curious to know whether kids were picking up conversational habits from their everyday interactions with Alexa and other agents,&#8221; said senior author Alexis Hiniker, a UW assistant professor in the Information School. &#8220;A lot of the existing research looks at agents designed to teach a particular skill, like math. That&#8217;s somewhat different from the habits a child might incidentally acquire by chatting with one of these things.&#8221;</p>
<p>The researchers recruited 22 families from the Seattle area to participate in a five-part study. This project took place before the COVID-19 pandemic, so each child visited a lab with one parent and one researcher. For the first part of the study, children spoke to a simple animated robot or cactus on a tablet screen that also displayed the text of the conversation.</p>
<p>On the back end, another researcher who was not in the room asked each child questions, which the app translated into a synthetic voice and played for the child. The researcher listened to the child&#8217;s responses and reactions over speakerphone.</p>
<p>At first, as children spoke to one of the two conversational agents (the robot or the cactus), it told them: &#8220;When I&#8217;m talking, sometimes I begin to speak very slowly. You can say &#8216;bungo&#8217; to remind me to speak quickly again.&#8221;</p>
<p>After a few minutes of chatting with a child, the app switched to a mode where it would periodically slow down the agent&#8217;s speech until the child said &#8220;bungo.&#8221; Then the researcher pressed a button to immediately return the agent&#8217;s speech to normal speed. During this session, the agent reminded the child to use <em>bungo</em> if needed. The conversation continued until the child had practiced using <em>bungo</em> at least three times.</p>
<p>The majority of the children, 64%, remembered to use <em>bungo</em> the first time the agent slowed its speech, and all of them learned the routine by the end of this session.</p>
<p>Then the children were introduced to the other agent. This agent also started to periodically speak slowly after a brief conversation at normal speed. While the agent&#8217;s speech also returned to normal speed once the child said &#8220;bungo,&#8221; this agent did not remind them to use that word. Once the child said &#8220;bungo&#8221; five times or let the agent continue speaking slowly for five minutes, the researcher in the room ended the conversation.</p>
<p>By the end of this session, 77% of the children had successfully used <em>bungo</em> with this agent.</p>
<p>At this point, the researcher in the room left. Once alone, the parent chatted with the child and then, as with the robot and the cactus, randomly started speaking slowly. The parent didn&#8217;t give any reminders about using the word <em>bungo</em>.</p>
<p>Only 19 parents conducted this part of the study. Of the children who completed this part, 68% used <em>bungo</em> in conversation with their parents. Many of them used it with affection. Some children did so enthusiastically, often cutting their parents off in mid-sentence. Others expressed hesitation or frustration, asking their parents why they were acting like robots.</p>
<p>When the researcher returned, they had a similar conversation with the child: normal at first, followed by slower speech. In this situation, only 18% of the 22 children used <em>bungo</em> with the researcher. None of them commented on the researcher&#8217;s slow speech, though some of them made knowing eye contact with their parents.</p>
<p>&#8220;The kids showed really sophisticated social awareness in their transfer behaviors,&#8221; Hiniker said. &#8220;They saw the conversation with the second agent as a place where it was appropriate to use the word <em>bungo</em>. With parents, they saw it as a chance to bond and play. And then with the researcher, who was a stranger, they instead took the socially safe route of using the more traditional conversational norm of not interrupting someone who&#8217;s talking to you.&#8221;</p>
<p>After this session in the lab, the researchers wanted to know how <em>bungo</em> would fare &#8220;in the wild,&#8221; so they asked parents to try slowing down their speech at home over the next 24 hours.</p>
<p>Of the 20 parents who tried this at home, 11 reported that the children continued to use <em>bungo</em>. These parents described the experiences as playful, enjoyable and &#8220;like an inside joke.&#8221; For the children who expressed skepticism in the lab, many continued that behavior at home, asking their parents to stop acting like robots or refusing to respond.</p>
<p>&#8220;There is a very deep sense for kids that robots are not people, and they did not want that line blurred,&#8221; Hiniker said. &#8220;So for the children who didn&#8217;t mind bringing this interaction to their parents, it became something new for them. It wasn&#8217;t like they were starting to treat their parent like a robot. They were playing with them and connecting with someone they love.&#8221;</p>
<p>Although these findings suggest that children will treat Siri differently from the way they treat people, it&#8217;s still possible that conversations with an agent might subtly influence children&#8217;s habits &#8212; such as using a particular type of language or conversational tone &#8212; when they speak to other people, Hiniker said.</p>
<p>But the fact that many kids wanted to try out something new with their parents suggests that designers could create shared experiences like this to help kids learn new things.</p>
<p>&#8220;I think there&#8217;s a great opportunity here to develop educational experiences for conversational agents that kids can try out with their parents. There are so many conversational strategies that can help kids learn and grow and develop strong interpersonal relationships, such as labeling your feelings, using &#8216;I&#8217; statements or standing up for others,&#8221; Hiniker said. &#8220;We saw that kids were excited to playfully practice a conversational interaction with their parent after they learned it from a device. My other takeaway for parents is not to worry. Parents know their kid best and have a good sense of whether these sorts of things shape their own child&#8217;s behavior. But I have more confidence after running this study that kids will do a good job of differentiating between devices and people.&#8221;</p>
<p>Other co-authors on this paper are Amelia Wang and Jonathan Tran, both of whom completed this research as UW undergraduate students majoring in human centered design and engineering; Mingrui Zhang, a UW doctoral student in the iSchool; Jenny Radesky, an assistant professor at the University of Michigan Medical School; Kiley Sobel, a senior user experience researcher at Duolingo who previously received a doctorate from the UW; and Sungsoo Ray Hong, an assistant professor at George Mason University. This research was funded by a Jacobs Foundation Early Career Fellowship.</p>
<ol>
<li>Alexis Hiniker, Amelia Wang, Jonathan Tran, Mingrui Ray Zhang, Jenny Radesky, Kiley Sobel, Sungsoo Ray Hong. <strong>Can Conversational Agents Change the Way Children Talk to People?</strong> <em>Interaction Design and Children conference</em>, 2021; DOI: <a href="http://dx.doi.org/10.1145/3459990.3460695">10.1145/3459990.3460695</a></li>
</ol>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Inflatable robotic hand gives amputees real-time tactile control</title>
		<link>https://pharmacyupdateonline.com/2021/08/inflatable-robotic-hand-gives-amputees-real-time-tactile-control/</link>
		
		<dc:creator><![CDATA[Charlie King]]></dc:creator>
		<pubDate>Sat, 28 Aug 2021 10:00:08 +0000</pubDate>
				<category><![CDATA[Devices and Technology]]></category>
		<category><![CDATA[Medical Devices]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[amputees]]></category>
		<category><![CDATA[Inflatable robotic hand]]></category>
		<category><![CDATA[medical device]]></category>
		<category><![CDATA[robotics]]></category>
		<category><![CDATA[tactile control]]></category>
		<guid isPermaLink="false">https://www.pharmacyupdate.online/?p=1126</guid>

					<description><![CDATA[Prosthetic enables a wide range of daily activities, such as zipping a suitcase, shaking hands, and petting a cat. Massachusetts Institute of Technology An MIT-developed inflatable robotic hand [&#8230;]]]></description>
										<content:encoded><![CDATA[<p><strong>Prosthetic enables a wide range of daily activities, such as zipping a suitcase, shaking hands, and petting a cat.</strong></p>
<p>Massachusetts Institute of Technology</p>
<p>An MIT-developed inflatable robotic hand gives amputees real-time tactile control. The smart hand is soft and elastic, weighs about half a pound, and costs a fraction of comparable prosthetics.</p>
<p>Existing neuroprosthetics offer high-tech dexterity, but that dexterity comes at a price: they can cost tens of thousands of dollars and are built around metal skeletons, with electrical motors that can be heavy and rigid.</p>
<p>Now engineers at MIT and Shanghai Jiao Tong University have designed a soft, lightweight, and potentially low-cost neuroprosthetic hand. Amputees who tested the artificial limb performed daily activities, such as zipping a suitcase, pouring a carton of juice, and petting a cat, just as well as &#8212; and in some cases better than &#8212; those with more rigid neuroprosthetics.</p>
<p>The researchers found the prosthetic, designed with a system for tactile feedback, restored some primitive sensation in a volunteer&#8217;s residual limb. The new design is also surprisingly durable, quickly recovering after being struck with a hammer or run over with a car.</p>
<p>The smart hand is soft and elastic, and weighs about half a pound. Its components total around $500 &#8212; a fraction of the weight and material cost associated with more rigid smart limbs.</p>
<p>&#8220;This is not a product yet, but the performance is already similar or superior to existing neuroprosthetics, which we&#8217;re excited about,&#8221; says Xuanhe Zhao, professor of mechanical engineering and of civil and environmental engineering at MIT. &#8220;There&#8217;s huge potential to make this soft prosthetic very low cost, for low-income families who have suffered from amputation.&#8221;</p>
<p>Zhao and his colleagues have published their work today in <em>Nature Biomedical Engineering</em>. Co-authors include MIT postdoc Shaoting Lin, along with Guoying Gu, Xiangyang Zhu, and collaborators at Shanghai Jiao Tong University in China.</p>
<p><strong>Big Hero hand</strong></p>
<p>The team&#8217;s pliable new design bears an uncanny resemblance to a certain inflatable robot in the animated film &#8220;Big Hero 6.&#8221; Like the squishy android, the team&#8217;s artificial hand is made from soft, stretchy material &#8212; in this case, the commercial elastomer EcoFlex. The prosthetic comprises five balloon-like fingers, each embedded with segments of fiber, similar to articulated bones in actual fingers. The bendy digits are connected to a 3-D-printed &#8220;palm,&#8221; shaped like a human hand.</p>
<p>Rather than controlling each finger using mounted electrical motors, as most neuroprosthetics do, the researchers used a simple pneumatic system to precisely inflate fingers and bend them in specific positions. This system, including a small pump and valves, can be worn at the waist, significantly reducing the prosthetic&#8217;s weight.</p>
<p>Lin developed a computer model to relate a finger&#8217;s desired position to the corresponding pressure a pump would have to apply to achieve that position. Using this model, the team developed a controller that directs the pneumatic system to inflate the fingers, in positions that mimic five common grasps, including pinching two and three fingers together, making a balled-up fist, and cupping the palm.</p>
<p>The pneumatic system receives signals from EMG sensors &#8212; electromyography sensors that measure electrical signals generated by motor neurons to control muscles. The sensors are fitted at the prosthetic&#8217;s opening, where it attaches to a user&#8217;s limb. In this arrangement, the sensors can pick up signals from a residual limb, such as when an amputee imagines making a fist.</p>
<p>The team then used an existing algorithm that &#8220;decodes&#8221; muscle signals and relates them to common grasp types. They used this algorithm to program the controller for their pneumatic system. When an amputee imagines, for instance, holding a wine glass, the sensors pick up the residual muscle signals, which the controller then translates into corresponding pressures. The pump then applies those pressures to inflate each finger and produce the amputee&#8217;s intended grasp.</p>
<p>Going a step further in their design, the researchers looked to enable tactile feedback &#8212; a feature that is not incorporated in most commercial neuroprosthetics. To do this, they stitched to each fingertip a pressure sensor, which when touched or squeezed produces an electrical signal proportional to the sensed pressure. Each sensor is wired to a specific location on an amputee&#8217;s residual limb, so the user can &#8220;feel&#8221; when the prosthetic&#8217;s thumb is pressed, for example, versus the forefinger.</p>
<p><strong>Good grip</strong></p>
<p>To test the inflatable hand, the researchers enlisted two volunteers, each with upper-limb amputations. Once outfitted with the neuroprosthetic, the volunteers learned to use it by repeatedly contracting the muscles in their arm while imagining making five common grasps.</p>
<p>After completing this 15-minute training, the volunteers were asked to perform a number of standardized tests to demonstrate manual strength and dexterity. These tasks included stacking checkers, turning pages, writing with a pen, lifting heavy balls, and picking up fragile objects like strawberries and bread. They repeated the same tests using a more rigid, commercially available bionic hand and found that the inflatable prosthetic was as good, or even better, at most tasks, compared to its rigid counterpart.</p>
<p>One volunteer was also able to intuitively use the soft prosthetic in daily activities, for instance to eat food like crackers, cake, and apples, and to handle objects and tools, such as laptops, bottles, hammers, and pliers. This volunteer could also safely manipulate the squishy prosthetic, for instance to shake someone&#8217;s hand, touch a flower, and pet a cat.</p>
<p>In a particularly exciting exercise, the researchers blindfolded the volunteer and found he could discern which prosthetic finger they poked and brushed. He was also able to &#8220;feel&#8221; bottles of different sizes that were placed in the prosthetic hand, and lifted them in response. The team sees these experiments as a promising sign that amputees can regain a form of sensation and real-time control with the inflatable hand.</p>
<p>The team has filed a patent on the design, through MIT, and is working to improve its sensing and range of motion.</p>
<p>&#8220;We now have four grasp types. There can be more,&#8221; Zhao says. &#8220;This design can be improved, with better decoding technology, higher-density myoelectric arrays, and a more compact pump that could be worn on the wrist. We also want to customize the design for mass production, so we can translate soft robotic technology to benefit society.&#8221;</p>
<p>Video: <a href="https://www.youtube.com/watch?v=p1d8i2lwuFw&amp;t=20s">https://www.youtube.com/watch?v=p1d8i2lwuFw&amp;t=20s</a></p>
<p><strong>Journal Reference</strong>:</p>
<ol>
<li>Guoying Gu, Ningbin Zhang, Haipeng Xu, Shaoting Lin, Yang Yu, Guohong Chai, Lisen Ge, Houle Yang, Qiwen Shao, Xinjun Sheng, Xiangyang Zhu, Xuanhe Zhao. <strong>A soft neuroprosthetic hand providing simultaneous myoelectric control and tactile feedback</strong>. <em>Nature Biomedical Engineering</em>, 2021; DOI: <a href="http://dx.doi.org/10.1038/s41551-021-00767-0">10.1038/s41551-021-00767-0</a></li>
</ol>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Brain stimulation evoking sense of touch improves control of robotic arm</title>
		<link>https://pharmacyupdateonline.com/2021/05/brain-stimulation-evoking-sense-of-touch-improves-control-of-robotic-arm/</link>
		
		<dc:creator><![CDATA[Charlie King]]></dc:creator>
		<pubDate>Sat, 29 May 2021 10:00:04 +0000</pubDate>
				<category><![CDATA[Devices and Technology]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[Brain stimulation]]></category>
		<category><![CDATA[prosthetic]]></category>
		<category><![CDATA[robotic arm]]></category>
		<guid isPermaLink="false">https://www.pharmacyupdate.online/?p=753</guid>

					<description><![CDATA[Most able-bodied people take their ability to perform simple daily tasks for granted &#8212; when they reach for a warm mug of coffee, they can feel its weight [&#8230;]]]></description>
										<content:encoded><![CDATA[<p>Most able-bodied people take their ability to perform simple daily tasks for granted &#8212; when they reach for a warm mug of coffee, they can feel its weight and temperature and adjust their grip accordingly so that no liquid is spilled. People with full sensory and motor control of their arms and hands can feel that they&#8217;ve made contact with an object the instant they touch or grasp it, allowing them to start moving or lifting it with confidence.</p>
<p>But those tasks become much more difficult when a person operates a prosthetic arm, let alone a mind-controlled one.</p>
<p>In a paper published today in <em>Science</em>, a team of bioengineers from the University of Pittsburgh Rehab Neural Engineering Labs describe how adding brain stimulation that evokes tactile sensations makes it easier for the operator to manipulate a brain-controlled robotic arm. In the experiment, supplementing vision with artificial tactile perception cut the time spent grasping and transferring objects in half, from a median time of 20.9 to 10.2 seconds.</p>
<p>&#8220;In a sense, this is what we hoped would happen &#8212; but perhaps not to the degree that we observed,&#8221; said co-senior author Jennifer Collinger, Ph.D., associate professor in the Pitt Department of Physical Medicine and Rehabilitation. &#8220;Sensory feedback from limbs and hands is hugely important for doing normal things in our daily lives, and when that feedback is lacking, people&#8217;s performance is impaired.&#8221;</p>
<p>Study participant Nathan Copeland, whose progress was described in the paper, is the first person in the world to be implanted with tiny electrode arrays not just in his brain&#8217;s motor cortex but in his somatosensory cortex as well &#8212; a region of the brain that processes sensory information from the body. The arrays allow him not only to control the robotic arm with his mind, but also to receive tactile sensory feedback, which is similar to how neural circuits operate when a person&#8217;s spinal cord is intact.</p>
<p>&#8220;I was already extremely familiar with both the sensations generated by stimulation and performing the task without stimulation. Even though the sensation isn&#8217;t &#8216;natural&#8217; &#8212; it feels like pressure and gentle tingle &#8212; that never bothered me,&#8221; said Copeland. &#8220;There wasn&#8217;t really any point where I felt like stimulation was something I had to get used to. Doing the task while receiving the stimulation just went together like PB&amp;J.&#8221;</p>
<p>After a car crash that left him with limited use of his arms, Copeland enrolled in a clinical trial testing the sensorimotor microelectrode brain-computer interface (BCI) and was implanted with four microelectrode arrays developed by Blackrock Microsystems (also commonly referred to as Utah arrays).</p>
<p>This paper is a step forward from an earlier study that described for the first time how stimulating sensory regions of the brain using tiny electrical pulses can evoke sensation in distinct regions of a person&#8217;s hand, even though they lost feeling in their limbs due to spinal cord injury. In this new study, the researchers combined reading the information from the brain to control the movement of the robotic arm with writing information back in to provide sensory feedback.</p>
<p>In a series of tests, where the BCI operator was asked to pick up and transfer various objects from a table to a raised platform, providing tactile feedback through electrical stimulation allowed the participant to complete tasks twice as fast compared to tests without stimulation.</p>
<p>In the new paper, the researchers wanted to test the effect of sensory feedback in conditions that would resemble the real world as closely as possible.</p>
<p>&#8220;We didn&#8217;t want to constrain the task by removing the visual component of perception,&#8221; said co-senior author Robert Gaunt, Ph.D., associate professor in the Pitt Department of Physical Medicine and Rehabilitation. &#8220;When even limited and imperfect sensation is restored, the person&#8217;s performance improved in a pretty significant way. We still have a long way to go in terms of making the sensations more realistic and bringing this technology to people&#8217;s homes, but the closer we can get to recreating the normal inputs to the brain, the better off we will be.&#8221;</p>
<p>This work was supported by the Defense Advanced Research Projects Agency (DARPA) and Space and Naval Warfare Systems Center Pacific (SSC Pacific) under Contract No. N66001-16-C-4051 and the Revolutionizing Prosthetics program (Contract No. N66001-10-C-4056).</p>
<p><strong>Journal Reference</strong>:</p>
<ol>
<li>Sharlene N. Flesher, John E. Downey, Jeffrey M. Weiss, Christopher L. Hughes, Angelica J. Herrera, Elizabeth C. Tyler-Kabara, Michael L. Boninger, Jennifer L. Collinger, Robert A. Gaunt. <strong>A brain-computer interface that evokes tactile sensations improves robotic arm control</strong>. <em>Science</em>, 2021; DOI: <a href="http://dx.doi.org/10.1126/science.abd0380">10.1126/science.abd0380</a></li>
</ol>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>3D &#8216;bioprinting&#8217; used to create nose cartilage</title>
		<link>https://pharmacyupdateonline.com/2021/05/3d-bioprinting-used-to-create-nose-cartilage/</link>
		
		<dc:creator><![CDATA[Charlie King]]></dc:creator>
		<pubDate>Sun, 16 May 2021 10:00:47 +0000</pubDate>
				<category><![CDATA[Devices and Technology]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[3D bio printing]]></category>
		<category><![CDATA[medical devices]]></category>
		<category><![CDATA[nose cartilage]]></category>
		<guid isPermaLink="false">https://www.pharmacyupdate.online/?p=718</guid>

					<description><![CDATA[A team of University of Alberta researchers has discovered a way to use 3-D bioprinting technology to create custom-shaped cartilage for use in surgical procedures. The work aims [&#8230;]]]></description>
										<content:encoded><![CDATA[<p>A team of University of Alberta researchers has discovered a way to use 3-D bioprinting technology to create custom-shaped cartilage for use in surgical procedures. The work aims to make it easier for surgeons to safely restore the features of skin cancer patients living with nasal cartilage defects after surgery.</p>
<p>The researchers used a specially designed hydrogel &#8212; a material similar to Jell-O &#8212; that could be mixed with cells harvested from a patient and then printed in a specific shape captured through 3-D imaging. Over a matter of weeks, the material is cultured in a lab to become functional cartilage.</p>
<p>&#8220;It takes a lifetime to make cartilage in an individual, while this method takes about four weeks. So you still expect that there will be some degree of maturity that it has to go through, especially when implanted in the body. But functionally it&#8217;s able to do the things that cartilage does,&#8221; said Adetola Adesida, a professor of surgery in the Faculty of Medicine &amp; Dentistry.</p>
<p>&#8220;It has to have certain mechanical properties and it has to have strength. This meets those requirements with a material that (at the outset) is 92 per cent water,&#8221; added Yaman Boluk, a professor in the Faculty of Engineering.</p>
<p>Adesida, Boluk and graduate student Xiaoyi Lan led the project to create the 3-D printed cartilage in hopes of providing a better solution for a clinical problem facing many patients with skin cancer.</p>
<p>Each year upwards of three million people in North America are diagnosed with non-melanoma skin cancer. Of those, 40 per cent will have lesions on their noses, with many requiring surgery to remove them. As part of the procedure, many patients may have cartilage removed, leaving facial disfiguration.</p>
<p>Traditionally, surgeons would take cartilage from one of the patient&#8217;s ribs and reshape it to fit the needed size and shape for reconstructive surgery. But the procedure comes with complications.</p>
<p>&#8220;When the surgeons restructure the nose, it is straight. But when it adapts to its new environment, it goes through a period of remodelling where it warps, almost like the curvature of the rib,&#8221; said Adesida. &#8220;Visually on the face, that&#8217;s a problem.</p>
<p>&#8220;The other issue is that you&#8217;re opening the rib compartment, which protects the lungs, just to restructure the nose. It&#8217;s a very vital anatomical location. The patient could have a collapsed lung and has a much higher risk of dying,&#8221; he added.</p>
<p>The researchers say their work is an example of both precision medicine and regenerative medicine. Lab-grown cartilage printed specifically for the patient can remove the risk of lung collapse, infection in the lungs and severe scarring at the site of a patient&#8217;s ribs.</p>
<p>&#8220;This is to the benefit of the patient. They can go on the operating table, have a small biopsy taken from their nose in about 30 minutes, and from there we can build different shapes of cartilage specifically for them,&#8221; said Adesida. &#8220;We can even bank the cells and use them later to build everything needed for the surgery. This is what this technology allows you to do.&#8221;</p>
<p>The team is continuing its research and is now testing whether the lab-grown cartilage retains its properties after transplantation in animal models. The team hopes to move the work to a clinical trial within the next two to three years.</p>
<p>The research was supported by grants from the Canadian Institutes of Health Research, Alberta Cancer Foundation, Canadian Foundation for Innovation, University Hospital Foundation, Natural Sciences and Engineering Research Council of Canada and Edmonton Civic Employees Charitable Assistance Fund.</p>
<p><strong>Journal Reference</strong>:</p>
<ol>
<li>Xiaoyi Lan, Yan Liang, Esra J. N. Erkut, Melanie Kunze, Aillette Mulet‐Sierra, Tianxing Gong, Martin Osswald, Khalid Ansari, Hadi Seikaly, Yaman Boluk, Adetola B. Adesida. <strong>Bioprinting of human nasoseptal chondrocytes‐laden collagen hydrogel for cartilage tissue engineering</strong>. <em>The FASEB Journal</em>, 2021; 35 (3) DOI: <a href="http://dx.doi.org/10.1096/fj.202002081R">10.1096/fj.202002081R</a></li>
</ol>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>ASHP publishes reports exploring pharmacy&#8217;s role in future of healthcare delivery</title>
		<link>https://pharmacyupdateonline.com/2021/03/ashp-publishes-reports-exploring-pharmacys-role-in-future-of-healthcare-delivery/</link>
					<comments>https://pharmacyupdateonline.com/2021/03/ashp-publishes-reports-exploring-pharmacys-role-in-future-of-healthcare-delivery/#respond</comments>
		
		<dc:creator><![CDATA[Alex Burton]]></dc:creator>
		<pubDate>Mon, 29 Mar 2021 08:00:42 +0000</pubDate>
				<category><![CDATA[Devices and Technology]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[ASHP]]></category>
		<category><![CDATA[pharmacy]]></category>
		<category><![CDATA[robotics]]></category>
		<guid isPermaLink="false">https://puo.r2slabs.co.uk/?p=519</guid>

					<description><![CDATA[ASHP (American Society of Health-System Pharmacists) today announced the publication of two landmark reports that articulate a futuristic vision for pharmacy practice, including expanded roles for the pharmacy enterprise [&#8230;]]]></description>
										<content:encoded><![CDATA[<p>ASHP (American Society of Health-System Pharmacists) today announced the publication of two landmark reports that articulate a futuristic vision for <a href="https://www.ashp.org/Pharmacy-Practice/PAI">pharmacy practice</a>, including expanded roles for the pharmacy enterprise in healthcare organizations. The 2021 ASHP/ASHP Foundation Pharmacy Forecast Report and the Vizient Pharmacy Network High-Value Pharmacy Enterprise (HVPE) framework, published in <a href="https://academic.oup.com/ajhp"><em>AJHP</em></a>, outline opportunities for pharmacy leaders to advance patient-centered care, population health, and the overall well-being of their organizations.</p>
<p>&#8220;During these unprecedented times, it is more important than ever for pharmacy leaders to demonstrate the value pharmacy services contribute to the health of patients and the healthcare system,&#8221; said ASHP CEO Paul W. Abramowitz, Pharm.D., Sc.D. (Hon.), FASHP, and Karl Matuszewski, M.S., Pharm.D., Vizient Vice President of Member Connections. &#8220;The HVPE framework and the Pharmacy Forecast, used in conjunction with the ASHP Practice Advancement Initiative 2030 recommendations, are essential tools to help pharmacy leaders build a comprehensive, future-focused, patient-centered pharmacy enterprise that delivers optimal outcomes through safe and effective medication use.&#8221;</p>
<p>ASHP and the ASHP Foundation issue the Pharmacy Forecast annually to identify emerging issues and serve as a tool for dynamic strategic planning for pharmacy departments and health systems. The 2021 report is based on a survey on the likelihood of 42 potential impactful events occurring within the next five years in healthcare. A 17-member committee of pharmacy practice executives and specialists advised on the content of the survey, which was sent to a national panel of 319 experts in health-system pharmacy. The Pharmacy Forecast includes sections on the global supply chain, access to healthcare, analytics and big data, healthcare financing and delivery, patient safety, the pharmacy enterprise, and the pharmacy workforce. The report offers actionable, strategic recommendations for organizations to prepare for and respond to these emerging trends and issues that impact patient care.</p>
<p>The survey responses and recommendations in the 2021 Pharmacy Forecast reflect the disruptions caused by the COVID-19 pandemic and related economic crisis. Asked to predict trends for the next five years, 91% of the surveyed pharmacy directors agreed that at least 25 states will enact provisions to expand pharmacists&#8217; scope of practice during public health emergencies. More than 90% of the survey respondents expect a significant expansion of patient access to telehealth in rural and other underserved locations. The Forecast recommends that pharmacy leaders build on the expanded use of telehealth during the coronavirus pandemic to implement permanent telepharmacy services to enhance patients&#8217; medication-related outcomes, particularly those in underserved areas.</p>
<p>The global nature of the U.S. drug supply chain was initially a concern during the COVID-19 pandemic. A vast majority of leaders in the survey believed that global issues such as trade restrictions, pandemics, or climate change increase the potential for drug shortages. The Pharmacy Forecast suggests that health systems collaborate with other organizations and local and state agencies to plan for pandemic-related surges or distribution of scarce resources such as vaccines.</p>
<p>The HVPE framework establishes eight domains that address an expansive list of topics, including patient care services in the ambulatory, specialty pharmacy, and inpatient settings; safety and quality; pharmacy workforce; information technology, including data analytics and information management; business practices; and leadership. Consensus participants set a goal of adopting the 94 evidence-based statements and 336 performance elements in health system-based medication-use processes and pharmacy practice by 2025.</p>
<p>The <a href="https://academic.oup.com/ajhp/advance-article/doi/10.1093/ajhp/zxaa429/6128834">Pharmacy Forecast</a> and the <a href="https://academic.oup.com/ajhp/advance-article/doi/10.1093/ajhp/zxaa431/6128832">HVPE framework</a> are published online ahead of print and will appear in print in the March 15, 2021 issue of <em>AJHP</em>. The Pharmacy Forecast report was supported by a donation from Omnicell to the ASHP Foundation David A. Zilz Leaders for the Future fund.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://pharmacyupdateonline.com/2021/03/ashp-publishes-reports-exploring-pharmacys-role-in-future-of-healthcare-delivery/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Robots can help combat COVID-19</title>
		<link>https://pharmacyupdateonline.com/2021/03/robots-can-help-combat-covid-19/</link>
					<comments>https://pharmacyupdateonline.com/2021/03/robots-can-help-combat-covid-19/#respond</comments>
		
		<dc:creator><![CDATA[Alex Burton]]></dc:creator>
		<pubDate>Sat, 27 Mar 2021 08:00:50 +0000</pubDate>
				<category><![CDATA[COVID-19]]></category>
		<category><![CDATA[Devices and Technology]]></category>
		<category><![CDATA[Medicines and Therapeutics]]></category>
		<category><![CDATA[Robotics]]></category>
		<category><![CDATA[covid]]></category>
		<category><![CDATA[covid-19]]></category>
		<category><![CDATA[robotics]]></category>
		<guid isPermaLink="false">https://puo.r2slabs.co.uk/?p=516</guid>

					<description><![CDATA[Can robots be effective tools in combating the COVID-19 pandemic? A group of leaders in the field of robotics, including Henrik Christensen, director of UC San Diego&#8217;s Contextual [&#8230;]]]></description>
										<content:encoded><![CDATA[<p>Can robots be effective tools in combating the COVID-19 pandemic? A group of leaders in the field of robotics, including Henrik Christensen, director of UC San Diego&#8217;s Contextual Robotics Institute, say yes, and outline a number of examples in an editorial in the March 25 issue of <em>Science Robotics</em>. They say robots can be used for clinical care such as telemedicine and decontamination; logistics such as delivery and handling of contaminated waste; and reconnaissance such as monitoring compliance with voluntary quarantines.</p>
<p>&#8220;Already, we have seen robots being deployed for disinfection, delivering medications and food, measuring vital signs, and assisting border controls,&#8221; the researchers write.</p>
<p>Christensen, who is a professor in the Department of Computer Science and Engineering at UC San Diego, particularly highlighted the role that robots can play in disinfection, cleaning and telepresence.</p>
<p>Other co-authors include Marcia McNutt, president of the National Research Council and president of the National Academy of Sciences, as well as a number of other robotics experts from international and U.S. universities.</p>
<p>&#8220;For disease prevention, robot-controlled noncontact ultraviolet (UV) surface disinfection has already been used because COVID-19 spreads not only from person to person via close contact respiratory droplet transfer but also via contaminated surfaces,&#8221; the researchers write.</p>
<p>&#8220;Opportunities lie in intelligent navigation and detection of high-risk, high-touch areas, combined with other preventative measures,&#8221; the researchers add. &#8220;New generations of large, small, micro-, and swarm robots that are able to continuously work and clean (i.e., not only removing dust but also truly sanitizing/sterilizing all surfaces) could be developed.&#8221;</p>
<p>In terms of telepresence, &#8220;the deployment of social robots can present unique opportunities for continued social interactions and adherence to treatment regimens without fear of spreading more disease,&#8221; researchers write. &#8220;However, this is a challenging area of development because social interactions require building and maintaining complex models of people, including their knowledge, beliefs, emotions, as well as the context and environment of interaction.&#8221;</p>
<p>&#8220;COVID-19 may become the tipping point of how future organizations operate,&#8221; researchers add. &#8220;Rather than cancelling large international exhibitions and conferences, new forms of gathering&#8211;virtual rather than in-person attendance&#8211;may increase. Virtual attendees may become accustomed to remote engagement via a variety of local robotic avatars and controls.&#8221;</p>
<p>&#8220;Overall, the impact of COVID-19 may drive sustained research in robotics to address risks of infectious diseases,&#8221; researchers go on. &#8220;Without a sustainable approach to research and evaluation, history will repeat itself, and technology robots will not be ready to assist for the next incident.&#8221;</p>
]]></content:encoded>
					
					<wfw:commentRss>https://pharmacyupdateonline.com/2021/03/robots-can-help-combat-covid-19/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
