
Keywords: computer vision; foreground object detection; background subtraction; video degradation.

Each of the dynamic labeling schemes proposed to date differs in its characteristics and has its own advantages and limitations. The schemes differ in the queries they support, their update performance, their label size, and so on. In this paper, a new prefix-based labeling scheme is proposed that is compact and dynamic.



It also facilitates the computation of structural relationships, which is the core part of query processing. The proposed scheme can handle both static and dynamic XML documents. Experiments were conducted to evaluate storage requirements, structural-relationship computation, and update processing.
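As orientation only (the paper's exact label encoding is not reproduced in this summary), a Dewey-style prefix labeling can be sketched as follows: each node's label extends its parent's label with a sibling index, so ancestor-descendant and parent-child checks reduce to prefix tests.

```python
# Illustrative Dewey-style prefix labels; the proposed scheme's actual encoding may differ.

def is_ancestor(a, d):
    """a is an ancestor of d if a's label is a proper prefix of d's label."""
    return len(a) < len(d) and d[:len(a)] == a

def is_parent(a, d):
    """a is the parent of d if d's label is a's label plus one extra component."""
    return len(d) == len(a) + 1 and d[:len(a)] == a

# Labels are tuples of sibling positions: root = (1,), its second child = (1, 2), etc.
root, chapter, section = (1,), (1, 2), (1, 2, 1)
print(is_ancestor(root, section))   # True
print(is_parent(chapter, section))  # True
print(is_parent(root, section))     # False
```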

The results are compared with some of the existing labeling mechanisms. Keywords: Labeling scheme; XML; structural relationship; dynamic update; ancestor-descendant; parent-child relationship.

Cloud computing is powerful because it provides large-scale services at optimal cost and in a reliable manner. Load balancing of tasks across cloud servers is an important issue to be addressed. In this paper, we propose a task-clustering algorithm that minimizes the load across cloud servers through content-based load balancing of tasks, combined with a cost-reduction method for optimal energy consumption at all the cloud data-center heads.

The results reported in our paper improve on existing content-based load-balancing models. Our approach achieves optimal load balancing of tasks with respect to upload-bandwidth utilization, minimal latency, and other QoS (quality of service) metrics. Keywords: Cloud computing; load balancing; task clustering; cost reduction; energy consumption; QoS metrics.

Profit is gained when the maximum demand is satisfied.

Demand is satisfied when the maximum number of customers is covered or served. Various approaches to reaching the maximum number of customers have been investigated.

In general, most approaches to facility-location models treat a radius as the service area of a facility, so facilities whose service area really is a radius can be handled by the conventional approach. However, conventional approaches fail to allocate facilities that are constrained by topographical and road-network barriers. In this paper, we propose a model for optimized facility allocation in such scenarios. The proposed model uses a two-step clustering approach to solve the facility-location problem. Experimental results illustrate that the proposed algorithm, based on density affinity propagation (DAP), can construct a solution with maximal service and covering area.
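The DAP procedure itself is not detailed in this summary; the sketch below only illustrates the general two-step idea, here approximated with a density-based pass (DBSCAN) followed by affinity propagation to pick candidate facility sites within each dense group. Library choices and parameters are assumptions.

```python
# Hypothetical two-step clustering for facility placement (not the paper's exact DAP algorithm).
import numpy as np
from sklearn.cluster import DBSCAN, AffinityPropagation

rng = np.random.default_rng(0)
# Synthetic customer locations: two demand pockets separated by a gap (an implicit barrier).
customers = np.vstack([rng.normal([0, 0], 0.5, (50, 2)),
                       rng.normal([5, 5], 0.5, (50, 2))])

# Step 1: density-based grouping keeps customers on the same side of the gap together.
density_labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(customers)

# Step 2: within each dense group, affinity propagation selects exemplar points
# that can serve as candidate facility locations.
facilities = []
for label in set(density_labels) - {-1}:          # -1 marks DBSCAN noise points
    group = customers[density_labels == label]
    ap = AffinityPropagation(random_state=0).fit(group)
    facilities.extend(group[ap.cluster_centers_indices_])

print(f"{len(facilities)} candidate facility sites selected")
```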

Vasudevan, R.

Virtual reality, in contrast with augmented reality, keeps the user isolated from the real world and immersed in a world that is completely fabricated. The main objective of this research is to capture a real-life image and augment it as a component of a gaming environment using the principles of augmented reality. For this implementation, we have chosen car racing as our gaming environment.

The core elements are image segmentation using a CIELAB colour-space-based graph-cut algorithm, 2D-to-3D modelling, and game development with augmented reality. The tools utilised are MATLAB, insight3d, and Unity3D. The proposed idea enables a user to view a virtual environment with real components that are integrated dynamically.
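As a rough sketch of the CIELAB-based graph-cut step only (the paper's pipeline is implemented in MATLAB and insight3d, and its exact formulation is not given here), OpenCV's GrabCut, a graph-cut segmenter, can be run on a Lab-converted image; the file names and region of interest below are assumptions.

```python
# Rough sketch of CIELAB-based graph-cut foreground extraction using OpenCV's GrabCut;
# the paper's actual segmentation pipeline will differ.
import cv2
import numpy as np

image_bgr = cv2.imread("car.jpg")                            # hypothetical input photograph
image_lab = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2LAB)       # work in the CIELAB colour space

mask = np.zeros(image_lab.shape[:2], np.uint8)
bgd_model = np.zeros((1, 65), np.float64)
fgd_model = np.zeros((1, 65), np.float64)
rect = (50, 50, image_lab.shape[1] - 100, image_lab.shape[0] - 100)  # assumed ROI around the car

# Graph-cut optimisation over the Lab pixels, initialised from the rectangle.
cv2.grabCut(image_lab, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)

foreground = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 255, 0).astype(np.uint8)
cv2.imwrite("car_foreground.png", cv2.bitwise_and(image_bgr, image_bgr, mask=foreground))
```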


The dispersion coefficient Kx was the decision parameter for the proposed machine-learning models. MARS (multivariate adaptive regression splines) does not assume any functional relationship between the inputs and the output. The MARS model is a non-parametric regression model that splits the data and fits each interval with a basis function.
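For readers unfamiliar with MARS, the basis functions fitted on each interval are hinge functions of the form max(0, x - t) and max(0, t - x), and the model is a weighted sum of such terms; the snippet below only illustrates that idea and is not the authors' fitted Kx model.

```python
# Illustration of MARS-style hinge basis functions (not the authors' fitted Kx model).
import numpy as np

def hinge_pos(x, knot):
    """Basis function max(0, x - knot): active to the right of the knot."""
    return np.maximum(0.0, x - knot)

def hinge_neg(x, knot):
    """Mirrored basis function max(0, knot - x): active to the left of the knot."""
    return np.maximum(0.0, knot - x)

x = np.linspace(0.0, 10.0, 101)
# A MARS fit is a weighted sum of such piecewise-linear terms, for example:
y_hat = 1.2 + 0.8 * hinge_pos(x, 3.0) - 0.5 * hinge_neg(x, 7.0)
print(y_hat[:5])
```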


MPMR (minimax probability machine regression) is a probabilistic model that maximizes the minimum probability that the predicted output lies within some bound of the true regression function. Finally, the performance of the models is measured with different performance metrics.

The model is based on the human brain, with automatic and distributed pattern activity.

Methods for carrying out the different processes are suggested. The main purpose of this paper is to reaffirm earlier research on different knowledge-based and experience-based clustering techniques. The overall architecture has stayed essentially the same, so it is the localised processes, or smaller details, that have been updated. For example, a counting mechanism is now used slightly differently, to measure a level of cohesion over pattern instances rather than a correct classification.

The introduction of features has further enhanced the architecture, and a new entropy-style equation is proposed. While an earlier paper defined three levels of functional requirement, this paper redefines those levels in a more human vernacular, with higher-level goals described in terms of action-result pairs.


Keywords: Cognitive model; distributed architecture; entropy; neural network; concept tree.

One of these barriers is the lack of suitable training data, in both quantity and quality. The aim of this study is to investigate the effect of cross-corpus data on the automatic classification of emotional speech. We evaluate on three emotional databases from three different languages (English, Polish, and German), following three cross-corpus strategies. In the inter-corpus scenario, the obtained average recall is

Resource allocation followed by competent scheduling of tasks is a crucial concern in cloud computing.

Load balancing assigns incoming job requests to resources evenly so that each resource involved is utilized efficiently. The number of cloud users is immense, the volume of incoming job requests is arbitrary, and the data handled by cloud applications is enormous. Because resources in cloud computing are limited, it is challenging to deploy various applications with irregular capacities and functionalities in a heterogeneous multi-cloud environment.

In this paper, Genetic Algorithm-based task mapping followed by priority scheduling in a multi-cloud environment is proposed.


The proposed algorithm has two important phases, namely mapping and scheduling. Rigorous simulations were performed on synthetic data for a heterogeneous multi-cloud environment. The validity of the mapping and scheduling clearly demonstrates better performance of the entire system in terms of makespan time and throughput.
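The paper's GA operators and priority rules are not reproduced in this summary; the following is a minimal sketch of the mapping phase under simple assumptions (fixed task lengths, fixed cloud processing rates, and makespan as the fitness to minimize).

```python
# Minimal GA sketch for mapping tasks to clouds (illustrative; not the paper's exact algorithm).
import random

TASK_LENGTHS = [40, 10, 25, 60, 35, 15, 50, 20]   # assumed task sizes
CLOUD_SPEEDS = [10, 5, 8]                          # assumed processing rates per cloud

def makespan(mapping):
    """Finish time of the busiest cloud under a task -> cloud assignment."""
    loads = [0.0] * len(CLOUD_SPEEDS)
    for task, cloud in zip(TASK_LENGTHS, mapping):
        loads[cloud] += task / CLOUD_SPEEDS[cloud]
    return max(loads)

def evolve(pop_size=30, generations=100, mutation_rate=0.1):
    pop = [[random.randrange(len(CLOUD_SPEEDS)) for _ in TASK_LENGTHS]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)                     # lower makespan = fitter
        survivors = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(TASK_LENGTHS))
            child = a[:cut] + b[cut:]              # one-point crossover
            if random.random() < mutation_rate:    # mutation: reassign one task
                child[random.randrange(len(child))] = random.randrange(len(CLOUD_SPEEDS))
            children.append(child)
        pop = survivors + children
    return min(pop, key=makespan)

best = evolve()
print(best, makespan(best))
```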

These networks provide communication services between nearby vehicles and between vehicles and roadside infrastructure, improving road safety and travellers' comfort. The characteristics of VANETs, such as self-organization, low bandwidth, variable network density, and rapid changes in network topology, make providing safe driving and enhancing traffic efficiency difficult. A great deal of research has been performed on efficient and secure routing protocols. In this paper, we investigate and compare various routing protocols based on swarm intelligence and key distribution in VANETs.

Wireless sensor networks are a popular emerging technology commonly deployed in harsh environments. These networks depend mainly on battery power, so our mission is to reduce energy consumption as much as possible. Every routing protocol designed for sensor networks targets minimum energy consumption. In this paper, the LEACH protocol is modified with various shortest-path algorithms to find the best-performing configuration for the sensor network.
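As a reminder of the routing primitive being compared, a textbook Dijkstra shortest-path computation over a weighted graph of sensor nodes looks as follows; treating edge weights as link energy costs is an assumption, not necessarily the paper's cost model.

```python
# Textbook Dijkstra over an adjacency map; edge weights stand in for link energy costs.
import heapq

def dijkstra(graph, source):
    """Return the minimum cost from source to every reachable node."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry
        for neighbour, weight in graph.get(node, {}).items():
            nd = d + weight
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                heapq.heappush(heap, (nd, neighbour))
    return dist

# Tiny example network: node -> {neighbour: energy cost}
network = {"A": {"B": 2, "C": 5}, "B": {"C": 1, "D": 4}, "C": {"D": 1}, "D": {}}
print(dijkstra(network, "A"))   # {'A': 0.0, 'B': 2.0, 'C': 3.0, 'D': 4.0}
```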

Simulation results show that the Dijkstra algorithm performs best among the compared algorithms.

This optimization algorithm is based on the activities of shareholders seeking to maximize their profit in the exchange market.


The uniqueness of this algorithm lies in the fact that it enjoys a double exploitation and exploration property, unlike several other algorithms. To investigate its search capability, the EMA is used to solve active- and reactive-power-related objectives of power systems simultaneously, in the presence of several non-linear constraints.

Fuel cost (the active-power-related objective), transmission line loss, and total voltage deviation (the reactive-power-related objectives) are taken as the objective functions. The multi-objective optimization problem is solved through a weighted-sum approach. Both fuzzy and equal-weight approaches are used to declare the compromise solution.
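The weighted-sum step itself is simple to make concrete: each objective is normalized and the terms are combined into one scalar cost. The weights and reference values below are illustrative assumptions, not the paper's tuned settings.

```python
# Weighted-sum scalarisation of the three OPF objectives (illustrative values only).
def weighted_sum(objectives, weights, normalizers):
    """Combine objectives into a single cost; each term is scaled by a reference value."""
    return sum(w * (obj / ref) for obj, w, ref in zip(objectives, weights, normalizers))

# Hypothetical candidate solution: fuel cost ($/h), line loss (MW), total voltage deviation (p.u.)
candidate = (802.4, 9.6, 0.42)
reference = (800.0, 10.0, 0.5)          # assumed base-case values used for normalization
equal_weights = (1 / 3, 1 / 3, 1 / 3)   # the equal-weight case mentioned in the abstract

print(weighted_sum(candidate, equal_weights, reference))
```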

The search capability of the EMA in solving the multi-objective power-system problems is compared with PSO-based solutions. Keywords: Optimal power flow; Exchange Market Algorithm; multi-objective optimization; Pareto front; fuzzy decision making.

Due to randomized mobility and the differing service classes of applications, the connection failure rate increases; this can be overcome through handover (HO).

With the increased demand for handovers, the number of networks scanned for decision making and the number of negotiations for connectivity become too large. To improve efficiency, a three-tier model is proposed in which requests of a similar type are grouped and a common negotiation is made, reducing the number of communication messages. Only qualified networks among all the reachable access points are considered in the decision. Handover-need estimation is performed to reduce unwanted handovers. Finally, adaptive resource management is achieved through a group-based call admission control (GB-CAC) algorithm that harmonises up to 50 percent of the resource utilisation, ensuring a higher number of connections with a negligible percentage of call blocking and dropping.
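A minimal sketch of the grouping idea, assuming each handover request carries a target network and a service class: requests sharing the same (network, class) pair are negotiated once instead of individually. The data layout is hypothetical.

```python
# Sketch of request grouping for common negotiation (data model is hypothetical).
from collections import defaultdict

requests = [
    {"terminal": "t1", "target_network": "WLAN-3", "service_class": "video"},
    {"terminal": "t2", "target_network": "WLAN-3", "service_class": "video"},
    {"terminal": "t3", "target_network": "LTE-1",  "service_class": "voice"},
    {"terminal": "t4", "target_network": "WLAN-3", "service_class": "video"},
]

groups = defaultdict(list)
for req in requests:
    groups[(req["target_network"], req["service_class"])].append(req["terminal"])

# One negotiation per group instead of one per terminal.
for (network, service_class), terminals in groups.items():
    print(f"negotiate once with {network} for {service_class}: {terminals}")
```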



Keywords: Point of attachment; handover; candidate networks; elimination factor; queues; quality of service; smart terminal.

Certain data-mining techniques, such as Bayesian networks, induction rules, or association-rule mining, can be applied only to discretized nominal data. Various studies show significant improvement for certain data-mining techniques when they are applied to discretized rather than continuous data.

Several discretization methods based on statistical techniques have been reported in the literature. Such statistical techniques are inadequate for capturing and exploiting the underlying knowledge inherent in the data and in the context of the study. Big data of high dimensionality, combined with the unavailability of any a priori knowledge of the study context, makes the situation even worse.


To overcome this limitation, we propose a novel knowledge-based semantic discretization method using data-mining techniques, in which discretization is based on semantic data. Semantic data is the domain knowledge inherent in the data itself and in the context of the study. Unlike semantic data mining, no explicit ontology is associated with the data for semantic discretization.

It is therefore a challenging task to identify, capture, interpret, and exploit the semantic data for semantic discretization. This study presents the novel concept of semantic discretization and demonstrates the application of data-mining techniques in extracting semantic data, which is then used in knowledge-based semantic discretization.
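As a toy contrast between purely statistical and knowledge-based discretization (using a glucose-like attribute as in the Pima Indian Diabetes dataset discussed below; the clinical cut-offs shown are illustrative, and the paper derives its semantic intervals from the data itself rather than from a fixed table):

```python
# Contrast of equal-width binning with knowledge-driven cut-offs for a glucose attribute.
# The thresholds are illustrative; the proposed method extracts its intervals from the data.
import pandas as pd

glucose = pd.Series([85, 110, 145, 168, 199, 210, 95, 132])

# Purely statistical: three equal-width bins over the observed range.
statistical = pd.cut(glucose, bins=3, labels=["low", "medium", "high"])

# Knowledge-based: bins follow domain cut-offs (e.g. normal / elevated / high ranges).
semantic = pd.cut(glucose, bins=[0, 140, 200, 1000],
                  labels=["normal", "elevated", "high"])

print(pd.DataFrame({"glucose": glucose, "statistical": statistical, "semantic": semantic}))
```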

We show the effectiveness of the proposed methodology by applying it to the Pima Indian Diabetes dataset, a standard dataset taken from the UCI Machine Learning repository.

Cryptography and steganography are well-known methods of providing security: the former uses techniques that transform information in order to cipher it or hide its presence, while the latter concentrates on data concealment. Steganography is the practice of masking data, especially multimedia data, within other data. Visual content receives more attention from people than audio content, and a visual content file is far larger than an audio file, which helps increase the robustness of hiding algorithms.

In this paper, we propose image-steganography algorithms in three domains, along with experimental results on the USC-SIPI image database that demonstrate their improvement over traditional algorithms.
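To make the comparison concrete, a baseline spatial-domain LSB embed and PSNR measurement can be sketched as follows; this is plain LSB substitution on synthetic data, not the rule-based variant proposed below.

```python
# Plain LSB substitution and PSNR measurement (baseline only, not the proposed rule-based scheme).
import numpy as np

def embed_lsb(cover, bits):
    """Write one payload bit into the least-significant bit of each leading pixel."""
    stego = cover.copy().ravel()
    stego[:len(bits)] = (stego[:len(bits)] & 0xFE) | bits
    return stego.reshape(cover.shape)

def psnr(original, modified):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((original.astype(np.float64) - modified.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

rng = np.random.default_rng(1)
cover = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)    # stand-in for a grayscale image
payload = rng.integers(0, 2, size=4096, dtype=np.uint8)          # 4096 secret bits

stego = embed_lsb(cover, payload)
print(f"PSNR after embedding: {psnr(cover, stego):.2f} dB")
```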

We propose a rule-based LSB substitution method in the spatial domain, XOR-based hiding in the frequency domain, and Data Encryption Standard-based embedding in the wavelet domain. We find that the proposed algorithms have a better PSNR value, averaging close to 53 after embedding the secret data, while the existing algorithms have values of around

This work concentrates on developing machine-learning algorithms combined with a mathematical model for classifying images in digital mammograms as malignant or benign.

The mathematical concept of fuzzy soft set theory is advocated here; it is an extension of crisp and fuzzy sets with parameterization.
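For orientation, the definition usually meant by this parameterization (a reminder under the standard notation, not necessarily the paper's specific construction) is:

```latex
% Standard fuzzy soft set definition; notation is an assumption, not taken from the paper.
% A fuzzy soft set over a universe $U$ with parameter set $E$ is a pair $(F, A)$:
\[
  (F, A), \quad A \subseteq E, \qquad F : A \to \mathcal{F}(U),
\]
\[
  F(e) = \{\, (x, \mu_{F(e)}(x)) : x \in U \,\}, \qquad \mu_{F(e)} : U \to [0, 1],
\]
% i.e. each parameter $e$ is mapped to a fuzzy subset of $U$.
```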