<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:dc="http://purl.org/dc/elements/1.1/" version="2.0">
  <channel>
    <title>UCSC Digital Library Collection:</title>
    <link>https://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/4140</link>
    <description />
    <pubDate>Thu, 26 Mar 2026 11:08:41 GMT</pubDate>
    <dc:date>2026-03-26T11:08:41Z</dc:date>
    <item>
      <title>Investigation of hidden patterns in Qmatic Customer Journeys in order to minimize the average waiting time</title>
      <link>https://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/4183</link>
      <description>Title: Investigation of hidden patterns in Qmatic Customer Journeys in order to minimize the average waiting time
Authors: Wijerathna, D.D.U.B.
Abstract: A queue management system is a critical component in any business&#xD;
sector. All queue management systems share the same goal: minimizing the&#xD;
waiting time of customers in queues.&#xD;
The aim of this study is to investigate the predictive variables that explain&#xD;
waiting time in bank virtual queues of Qmatic customer journeys. Knowledge&#xD;
Discovery in Databases (KDD), the standard data mining process, is employed as&#xD;
the methodology for discovering the hidden patterns. A GBM model is used as the&#xD;
data mining engine for waiting time estimation. One of the most important steps&#xD;
in creating the regression model is selecting the optimal feature set for&#xD;
predicting waiting time. According to the results, the best performance was&#xD;
obtained with a learning rate of 0.1, a maximum depth of 4, a minimum of 2&#xD;
samples per leaf, and a maximum feature fraction of 0.3. For the training&#xD;
process, 10-fold cross-validation is applied. The overall accuracy of the model&#xD;
is 71%.</description>
      <pubDate>Thu, 22 Jul 2021 00:00:00 GMT</pubDate>
      <guid isPermaLink="false">https://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/4183</guid>
      <dc:date>2021-07-22T00:00:00Z</dc:date>
    </item>
    <item>
      <title>Simulation of Climbing Plants with Twining Behaviour</title>
      <link>https://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/4182</link>
      <description>Title: Simulation of Climbing Plants with Twining Behaviour
Authors: Wickramaratne, W. O. N.
Abstract: Vegetation simulation in computer graphics is a highly active research&#xD;
area due to its vast variety of applications. Among the various types of&#xD;
vegetation, climbing plants are complex to simulate because of their combined&#xD;
rigid and soft body characteristics. Though state-of-the-art models have been&#xD;
able to simulate climbing plants in a visually realistic manner, none of them&#xD;
has been able to simulate the biomechanical behaviours of the twining plant. As&#xD;
twining plants, a subcategory of climbing plants, exhibit distinctive mechanical&#xD;
properties, a robust, biomechanically accurate model would facilitate many other&#xD;
research areas and fill the void in visualizing point-of-interest scenes of&#xD;
twining plant growth in the gaming and cinema industries.&#xD;
Previously, an attempt to fill the void in this area was made by Gunawardhana et&#xD;
al.; however, their proposed model was jittery and lacked stability. The&#xD;
proposed research was initiated as a continuation of this previous attempt, with&#xD;
the aim of synthesizing a robust model that can simulate a twining plant in a&#xD;
biomechanically accurate manner by improving the previous model. However, this&#xD;
approach had to be rejected due to its limitations, and a novel state machine&#xD;
driven, particle-based model was proposed instead. This novel model was able to&#xD;
simulate circumnutation behaviour, twining behaviour, the seed germination&#xD;
process, secondary growth, and leaf growth in a biomechanically accurate manner.&#xD;
The proposed model was evaluated in terms of realism and performance using a&#xD;
survey and standard performance evaluation measures.</description>
      <pubDate>Thu, 22 Jul 2021 00:00:00 GMT</pubDate>
      <guid isPermaLink="false">https://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/4182</guid>
      <dc:date>2021-07-22T00:00:00Z</dc:date>
    </item>
    <item>
      <title>Lip Synchronization Modeling for Sinhala Speech</title>
      <link>https://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/4181</link>
      <description>Title: Lip Synchronization Modeling for Sinhala Speech
Authors: Weerathunga, W.A.C.J.
Abstract: Lip synchronization, also known as visual speech animation, is the&#xD;
process of matching speech with lip movements. Visual speech animation has a&#xD;
great impact on the gaming and animation film industries because it provides a&#xD;
realistic experience to users. Furthermore, this technology also supports better&#xD;
communication for deaf people.&#xD;
For most European languages, lip synchronization models have been developed and&#xD;
used widely in the entertainment industries. However, no research experiments&#xD;
have yet been conducted on speech animation for the Sinhala language. This is&#xD;
due to the limited contribution towards research development and the&#xD;
unavailability of resources.&#xD;
This research focuses on the problem of achieving a lip synchronization model&#xD;
for the Sinhala language. This project presents a study on how to map from&#xD;
acoustic speech to visual speech with the goal of generating perceptually&#xD;
natural speech animation.&#xD;
Initially, this study followed a deep learning approach, which was abandoned&#xD;
because there was not enough video data to achieve good performance. Next,&#xD;
experiments on developing a viseme alphabet were carried out using a static&#xD;
visemes approach on a video data set created by the author.&#xD;
The implemented lip synchronization model was assessed through a subjective&#xD;
evaluation based on six different categories, achieving 69% accuracy using the&#xD;
static visemes approach. As initial research on speech animation for the Sinhala&#xD;
language, this model accurately animates individual words rather than long&#xD;
sentences.</description>
      <pubDate>Thu, 22 Jul 2021 00:00:00 GMT</pubDate>
      <guid isPermaLink="false">https://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/4181</guid>
      <dc:date>2021-07-22T00:00:00Z</dc:date>
    </item>
    <item>
      <title>Improving User Experience with Multi-Dimensional Factors for Effective Interactions in a Dynamic Environment</title>
      <link>https://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/4180</link>
      <description>Title: Improving User Experience with Multi-Dimensional Factors for Effective Interactions in a Dynamic Environment
Authors: Wattearachchi, W.D.
Abstract: In the field of Human-Computer Interaction (HCI), improving the User&#xD;
Experience (UX) of mobile devices has become a necessity due to the emergence of&#xD;
smart technologies and the popularity of using mobile devices in day-to-day life&#xD;
rather than traditional desktop systems. The main aim of this research is to&#xD;
develop a model for a mobile device that can suggest adaptive functionalities&#xD;
based on the current user emotion and the context. To the best of our knowledge,&#xD;
there is no system that provides not only adaptive interfaces but also adaptive&#xD;
functionalities within a mobile device, which would enhance the acceptability&#xD;
and usability of that particular system.&#xD;
As a proof of concept, a keyboard named “Emotional Keyboard” was developed&#xD;
iteratively through five prototypes using Evolutionary Prototyping. As the&#xD;
methodology, Action Research together with User-Centered Design (UCD) was&#xD;
followed, which also included two user surveys. Initial decisions were taken&#xD;
after conducting the first survey; Prototypes 1, 2 and 3 were then developed and&#xD;
evaluated with the participation of 40 users. Prototype 3 incorporated an&#xD;
Artificial Neural Network (ANN), trained using the data collected during the&#xD;
evaluations of Prototypes 1 and 2, which can decide the most appropriate emotion&#xD;
by combining the emotions detected from facial expressions and text.&#xD;
Consequently, Prototypes 4 and 5 were developed, which can suggest the most&#xD;
affective function based on the emotion and the context (location, time and user&#xD;
activity). These prototypes incorporate the data collected in the second survey&#xD;
to build a “Preference Tree”, which holds the probabilities of choosing&#xD;
functions and also accounts for frequently used functions. The evaluation of&#xD;
Prototypes 4 and 5 was carried out with the participation of 18 users, where&#xD;
individual and general analyses were performed and showed that, over time, the&#xD;
model was able to correctly suggest adaptive functions. This evaluation yields&#xD;
the conclusion of the research, thus paving the way to an “Adaptive System with&#xD;
User Control” to improve the acceptability and usability of a mobile device,&#xD;
which aligns with the research aim.</description>
      <pubDate>Thu, 22 Jul 2021 00:00:00 GMT</pubDate>
      <guid isPermaLink="false">https://dl.ucsc.cmb.ac.lk/jspui/handle/123456789/4180</guid>
      <dc:date>2021-07-22T00:00:00Z</dc:date>
    </item>
  </channel>
</rss>

