Scientist wants center for polymers available across PH
MANILA – A Balik Scientist on Thursday expressed his hope that the Center for Sustainable Polymers (CSP) -- which currently serves Region 10 -- be elevated into a national center so it can also serve other regions.
Balik Scientist Robert Malaluan said he is seeking the government’s support on this initiative.
Located at the Mindanao State University - Iligan Institute of Technology, the CSP was funded by the Department of Science and Technology (DOST) as a niche center for sustainable polymers in the region.
“The country has a lot of polymers, and we could make these a high-value product,” Malaluan said in a public briefing.
He cited as an example the CocoFlexSorb developed at the CSP, which seeks to help contain the oil spill in Mindoro. This bio-based polyurethane foam reportedly has superior oil-absorption capacity and can absorb different types of oil, from vegetable and kerosene to engine and bunker oil.
In the same briefing, CSP program leader Arnold Lubguban said this innovation costs about a fourth to a fifth as much as existing petroleum-based absorbents. Further, the coco foam can be reused about 20 to 30 times.
Malaluan said they are currently developing other products at the CSP.
“We are not just studying coconut oil but also other biopolymers, sustainable polymers that we aim to develop.”
Biggest problem with ChatGPT is ‘absolutely accuracy’
In February 2023, OpenAI announced its plan to offer a subscription service called ChatGPT Plus.
This service provides subscribers with advantages such as quicker responses and priority access to new updates and enhancements. It costs $20 per month.
ChatGPT has limitations, however: it can give wrong responses, and not just occasionally.
OpenAI itself acknowledges these limitations.
“ChatGPT sometimes writes a plausible-sounding but incorrect or nonsensical answer,” the company says.
Soma also agrees that the biggest problem with ChatGPT is “absolutely accuracy,” which raises potential ethical concerns about its use in journalism, such as bias and accuracy issues.
“Large language models tend to ‘hallucinate,’ and give answers to questions that are incorrect but sound accurate,” he said.
“Someone who can say ‘I don’t know’ is more trustworthy than someone who always has an answer, and it is unfortunately very difficult to cajole ChatGPT into saying ‘I don’t know.’”
Asked about the challenges journalists would face in integrating ChatGPT into their workflow, he said “fear” and a “lack of knowledge” probably are the largest issues for the news industry.
“The messaging around ChatGPT is one of those things -- it’s perfect and all-knowing, or it’s a biased garbage machine.”
“If journalists can take the time – not on deadline, not explicitly for work – to play around with ChatGPT in a guided environment, it could do a lot to help them see its strengths and weaknesses.” (Anadolu)