Research Article Volume 7 Issue 1
Department of Public Health, College of Applied Medical Sciences, King Khalid University, Saudi Arabia
Correspondence: Arif Habib, Department of Public Health, College of Applied Medical Sciences, King Khalid University, Abha, Saudi Arabia
Received: December 21, 2017 | Published: February 12, 2018
Citation: Habib A. Some new results on fuzzy generalized ‘useful’ mean lengths and its bounds. Biom Biostat Int J. 2018;7(1):54-58. DOI: 10.15406/bbij.2018.07.00190
In this paper, we propose a fuzzy generalized ‘useful’ parametric mean length and obtain its bounds in terms of a ‘useful’ fuzzy measure. The bounds obtained are new, and some known results arise as particular cases of the proposed measure and bounds.
Keywords: fuzzy sets, Shannon’s inequality, generalized Shannon’s inequality, coding theorem, Kerridge inaccuracy
AMS subject classification: 94A17, 94A24.
Fuzzy sets play a significant role in many deployed systems because of their capability to model non-statistical imprecision. Consequently, characterization and quantification of fuzziness are important issues that affect the management of uncertainty in many system models and designs. The notion of fuzzy sets was proposed by Zadeh1 with a view to tackling problems in which indefiniteness arising from a sort of intrinsic ambiguity plays a fundamental role. Fuzziness, a texture of uncertainty, results from the lack of a sharp distinction of the boundary of a set. Fuzzy sets allow imprecise knowledge to be used to define an event. The concept of entropy has been widely used in different areas, e.g. communication theory, statistical mechanics, pattern recognition, diagnostics and neural networks.
A fuzzy set is represented as
A = \{ (x, \mu_A(x)) : x \in X \},
where \mu_A(x) \in [0, 1] gives the degree of belongingness of the element x to the set A. If every element of the set A has membership grade 0 or 1, there is no uncertainty about it and the set is said to be a crisp set. On the other hand, a fuzzy set A is defined by a characteristic (membership) function \mu_A : X \to [0, 1].
The function \mu_A associates with each x \in X a grade of membership in the set A.
A fuzzy set A^{*} is called a sharpened version of the fuzzy set A if the following conditions are satisfied:
\mu_{A^{*}}(x) \le \mu_A(x) \ \text{if}\ \mu_A(x) \le \tfrac{1}{2}, \quad \text{and} \quad \mu_{A^{*}}(x) \ge \mu_A(x) \ \text{if}\ \mu_A(x) \ge \tfrac{1}{2}.
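The sharpened-version condition above can be checked mechanically. The following is a minimal sketch (all names are illustrative, not from the paper), representing a finite fuzzy set by its list of membership grades:

```python
# Sketch: check whether A* is a sharpened version of A, given the
# membership grades of both sets over the same finite universe.

def is_sharpened(mu_star, mu):
    """A* sharpens A iff mu*(x) <= mu(x) wherever mu(x) <= 1/2
    and mu*(x) >= mu(x) wherever mu(x) >= 1/2."""
    return all(
        (m_s <= m if m <= 0.5 else m_s >= m)
        for m_s, m in zip(mu_star, mu)
    )

mu      = [0.3, 0.5, 0.8]   # membership grades of A
mu_star = [0.1, 0.5, 0.9]   # a candidate sharpened version A*
print(is_sharpened(mu_star, mu))  # True: grades move away from 1/2
```

A sharpened version pushes every grade away from the maximally ambiguous value 1/2, which is why (see property P3 below) sharpening can only decrease fuzzy entropy.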
De Luca and Termini2 formulated a set of properties that are widely accepted as the criteria for defining any fuzzy entropy. In fuzzy set theory, entropy is a measure of fuzziness which expresses the average amount of ambiguity in deciding whether an element belongs to a set or not. So a measure of average fuzziness of a fuzzy set, H(A), should have the following properties to be a valid entropy:
P1: H(A) is minimum iff A is a crisp set, i.e. \mu_A(x) = 0 or 1 for all x;
P2: H(A) is maximum iff A is most fuzzy, i.e. \mu_A(x) = \tfrac{1}{2} for all x;
P3: H(A^{*}) \le H(A), where A^{*} is a sharpened version of A;
P4: H(A) = H(\bar{A}), where \bar{A} is the complement of A, i.e. \mu_{\bar{A}}(x) = 1 - \mu_A(x).
The importance of fuzzy sets comes from the fact that they can deal with imprecise and inexact information. Their application areas span from the design of fuzzy controllers to robotics and artificial intelligence.
Let X = (x_1, x_2, \dots, x_n) be a discrete random variable taking on a finite number of possible values, and let the membership function \mu_A(x_i) give the element x_i its degree of belongingness to the set A. Denote
(x_i;\ \mu_A(x_i)), \quad i = 1, 2, \dots, n, \quad 0 \le \mu_A(x_i) \le 1. \qquad (2.1)
We call the scheme (2.1) a finite fuzzy information scheme. Every finite scheme describes a state of uncertainty. De Luca and Termini2 introduced a quantity which, in a reasonable way, measures the amount of uncertainty (fuzzy entropy) associated with a given finite scheme. This measure is given by
H(A) = -\sum_{i=1}^{n} \left[ \mu_A(x_i)\log \mu_A(x_i) + (1-\mu_A(x_i))\log(1-\mu_A(x_i)) \right]. \qquad (2.2)
The measure (2.2) serves as a very suitable measure of the fuzzy entropy of the finite information scheme (2.1).
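The De Luca–Termini entropy (2.2) can be computed directly; a minimal sketch (function name illustrative):

```python
import math

def fuzzy_entropy(mu, base=2):
    """De Luca-Termini fuzzy entropy (2.2):
    H(A) = -sum[ mu*log(mu) + (1-mu)*log(1-mu) ], with 0*log 0 = 0."""
    def h(p):
        return -p * math.log(p, base) if p > 0 else 0.0
    return sum(h(m) + h(1 - m) for m in mu)

print(fuzzy_entropy([0.0, 1.0]))   # 0.0 : crisp set, no fuzziness (P1)
print(fuzzy_entropy([0.5, 0.5]))   # 2.0 : maximal ambiguity (P2)
```

The two sample calls confirm properties P1 and P2: a crisp set has zero entropy, while grades of 1/2 everywhere maximize it (one bit per element in base 2).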
Let a finite source of n source symbols be encoded using an alphabet of D symbols; then it has been shown by Feinstein3 that there is a uniquely decipherable instantaneous code with lengths n_1, n_2, \dots, n_n iff the following Kraft4 inequality is satisfied:
\sum_{i=1}^{n} D^{-n_i} \le 1. \qquad (2.3)
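Kraft's inequality (2.3) is straightforward to verify for a given set of code word lengths; a quick sketch:

```python
def kraft_satisfied(lengths, D=2):
    """Kraft inequality (2.3): sum_i D**(-n_i) <= 1 is necessary and
    sufficient for a uniquely decipherable instantaneous code to exist."""
    return sum(D ** (-n) for n in lengths) <= 1

print(kraft_satisfied([1, 2, 3, 3]))  # True  (1/2 + 1/4 + 1/8 + 1/8 = 1)
print(kraft_satisfied([1, 1, 2]))     # False (1/2 + 1/2 + 1/4 > 1)
```

The first length set is Kraft-tight (sum exactly 1), so no code word can be shortened; the second cannot be realized by any instantaneous binary code.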
Belis & Guiasu5 observed that a source is not completely specified by the probability distribution P over the source alphabet in the absence of its qualitative character. So it can be assumed (Belis & Guiasu5) that the source alphabet letters are assigned weights according to their importance or utility in the view of the experimenter.
Let U = (u_1, u_2, \dots, u_n) be the set of positive real numbers, where u_i is the utility or importance of the source symbol x_i. The utility, in general, is independent of the probability p_i of encoding the source symbol x_i. The information source is thus given by
[\,x_i;\ p_i;\ u_i\,], \quad i = 1, 2, \dots, n, \quad p_i \ge 0,\ \sum_{i=1}^{n} p_i = 1,\ u_i > 0. \qquad (2.4)
Belis & Guiasu5 introduced the following quantitative-qualitative measure of information:
H(P;U) = -\sum_{i=1}^{n} u_i p_i \log p_i, \qquad (2.5)
which is a measure of the average quantity of ‘valuable’ or ‘useful’ information provided by the information source (2.4).
Guiasu & Picard6 considered the problem of encoding the letters output by the source (2.4) by means of a single-letter prefix code whose code words are of lengths n_1, n_2, \dots, n_n respectively and satisfy Kraft’s inequality (2.3). They introduced the following ‘useful’ mean length of the code:
L(U) = \frac{\sum_{i=1}^{n} u_i p_i n_i}{\sum_{i=1}^{n} u_i p_i}. \qquad (2.6)
Further they derived a lower bound for (2.6).
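The useful information (2.5) and useful mean length (2.6) are simple weighted sums; the sketch below (names illustrative) computes both for a small source:

```python
import math

def useful_information(p, u, base=2):
    """Belis-Guiasu 'useful' information (2.5): H(P;U) = -sum u_i p_i log p_i."""
    return -sum(ui * pi * math.log(pi, base) for pi, ui in zip(p, u) if pi > 0)

def useful_mean_length(n, p, u):
    """Guiasu-Picard 'useful' mean code length (2.6):
    L(U) = sum(u_i p_i n_i) / sum(u_i p_i)."""
    num = sum(ui * pi * ni for ni, pi, ui in zip(n, p, u))
    den = sum(ui * pi for pi, ui in zip(p, u))
    return num / den

p = [0.5, 0.25, 0.25]   # source probabilities
u = [3.0, 1.0, 1.0]     # utilities assigned by the experimenter
n = [1, 2, 2]           # binary code word lengths (Kraft-tight)
print(useful_information(p, u))     # 2.5
print(useful_mean_length(n, p, u))  # 1.25
```

Note how the utility weights tilt both quantities toward the symbols the experimenter deems important: with all u_i = 1, (2.5) collapses to Shannon’s entropy and (2.6) to the ordinary mean code length.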
Now, corresponding to (2.5) and (2.6), we have the following fuzzy measures
H(A;U) = -\sum_{i=1}^{n} u_i \left[ \mu_A(x_i)\log \mu_A(x_i) + (1-\mu_A(x_i))\log(1-\mu_A(x_i)) \right] \qquad (2.7)
and
L(A;U) = \frac{\sum_{i=1}^{n} n_i u_i \left[ \mu_A(x_i) + (1-\mu_A(x_i)) \right]}{\sum_{i=1}^{n} u_i \left[ \mu_A(x_i) + (1-\mu_A(x_i)) \right]}, \qquad (2.8)
respectively.
In the next section, fuzzy coding theorems have been obtained by considering a new parametric fuzzy entropy function involving utilities and a generalized useful fuzzy code word mean length. The results obtained here are not only new but also generalize some well-known results available in the literature of information theory.
Consider the following model for a random experiment S,
where E = (E_1, E_2, \dots, E_n) is a finite system of events happening with respective probabilities p_i \ge 0, \sum_{i=1}^{n} p_i = 1, and credited with utilities u_i > 0, i = 1, 2, \dots, n. Denote the model by E, where
E = [\,E_i;\ p_i;\ u_i\,], \quad i = 1, 2, \dots, n. \qquad (3.1)
We call (3.1) a Utility Information Scheme (UIS). Belis & Guiasu5 proposed a measure of information called ‘useful information’ for this scheme, given by
H(P;U) = -\sum_{i=1}^{n} u_i p_i \log p_i, \qquad (3.2)
where H(P;U) reduces to Shannon’s7 entropy when the utility aspect of the scheme is ignored, i.e., when u_i = 1 for each i.
Guiasu & Picard6 considered the problem of encoding the outcomes in (3.1) by means of a prefix code with code words having lengths n_1, n_2, \dots, n_n and satisfying Kraft’s inequality4
\sum_{i=1}^{n} D^{-n_i} \le 1, \qquad (3.3)
where D is the size of the code alphabet. The useful mean length L(U) of the code was defined as
L(U) = \frac{\sum_{i=1}^{n} u_i p_i n_i}{\sum_{i=1}^{n} u_i p_i}, \qquad (3.4)
and the authors obtained bounds for it in terms of H(P;U).
Now, corresponding to (3.2) and (3.4), we have the following fuzzy measures
H(A;U) = -\sum_{i=1}^{n} u_i \left[ \mu_A(x_i)\log \mu_A(x_i) + (1-\mu_A(x_i))\log(1-\mu_A(x_i)) \right] \qquad (3.5)
and
L(A;U) = \frac{\sum_{i=1}^{n} n_i u_i \left[ \mu_A(x_i) + (1-\mu_A(x_i)) \right]}{\sum_{i=1}^{n} u_i \left[ \mu_A(x_i) + (1-\mu_A(x_i)) \right]}, \qquad (3.6)
Longo,8 Gurdial & Pessoa,9 Khan & Autar,10 Autar & Khan11 have studied generalized coding theorems by considering different generalized measures of (3.2) and (3.4) under condition (3.3) of unique decipherability.
In this paper, we study some coding theorems by considering a new function depending on two parameters and a utility function. The motivation for studying this new function is that it generalizes some entropy functions already existing in the literature. The function under study is closely related to Tsallis entropy, which is used in physics.
Consider a function
(4.1)
Where
Further consider,
(4.2)
Where
Theorem 4.1. For all integers
(4.3)
under the condition (3.3), with equality if and only if
(4.4)
Proof. We use the (reverse) Hölder inequality
\sum_{i=1}^{n} x_i y_i \ge \left( \sum_{i=1}^{n} x_i^{p} \right)^{1/p} \left( \sum_{i=1}^{n} y_i^{q} \right)^{1/q} \qquad (4.5)
for all x_i, y_i > 0, when p < 1 (p \ne 0) and \frac{1}{p} + \frac{1}{q} = 1, with equality if and only if there exists a positive number c such that
x_i^{p} = c\, y_i^{q}. \qquad (4.6)
Substituting suitable values of x_i and y_i in (4.5) and using (3.3), we obtain the result (4.3) after simplification.
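The reverse Hölder inequality (4.5) underpinning the proof is easy to sanity-check numerically. A sketch (names illustrative), using p = 1/2 so that the conjugate exponent q = p/(p−1) = −1:

```python
# Numeric check of the reverse Holder inequality (4.5):
# for 0 < p < 1 and 1/p + 1/q = 1 (so q < 0), and positive x_i, y_i,
#     sum x_i y_i  >=  (sum x_i**p)**(1/p) * (sum y_i**q)**(1/q).

def reverse_holder_gap(x, y, p):
    q = p / (p - 1)                       # conjugate exponent, q < 0
    lhs = sum(xi * yi for xi, yi in zip(x, y))
    rhs = (sum(xi ** p for xi in x) ** (1 / p)
           * sum(yi ** q for yi in y) ** (1 / q))
    return lhs - rhs                      # nonnegative when (4.5) holds

print(reverse_holder_gap([1.0, 2.0], [1.0, 3.0], 0.5) >= 0)        # True
# Equality case (4.6): x_i**p = c * y_i**q with c = 1 here.
print(abs(reverse_holder_gap([1.0, 1.0], [1.0, 1.0], 0.5)) < 1e-12)  # True
```

The second call exercises the equality condition (4.6): when x_i^p is a constant multiple of y_i^q for every i, the gap vanishes.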
Theorem 4.2. Every code with lengths n_i, i = 1, 2, \dots, n, can be made to satisfy
(4.7)
Proof. Let n_i be the positive integer satisfying the inequality
(4.8)
Consider the intervals
(4.9)
of length 1. In every such interval there lies exactly one positive integer n_i such that
(4.10)
It can be shown that the sequence (n_i), i = 1, 2, \dots, n, thus defined satisfies (3.3). From (4.10) we have
(4.11)
Multiplying both sides of (4.11) by the corresponding weights, summing over i and simplifying gives (4.7).
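The interval argument in the proof picks each n_i as the unique integer with D^{-n_i} ≤ w_i < D^{-n_i+1}, i.e. n_i = ⌈−log_D w_i⌉. The sketch below illustrates this classical construction with ordinary probabilities as the weights (the paper itself uses the parametric fuzzy/utility analogue, which is not recoverable from this text):

```python
import math

def shannon_lengths(w, D=2):
    """Pick each n_i as the unique positive integer with
    D**(-n_i) <= w_i < D**(-n_i + 1), i.e. n_i = ceil(-log_D w_i)."""
    return [math.ceil(-math.log(wi, D)) for wi in w]

def kraft_sum(lengths, D=2):
    return sum(D ** (-n) for n in lengths)

w = [0.4, 0.3, 0.2, 0.1]   # weights summing to 1 (plain probabilities here)
n = shannon_lengths(w)
print(n)                   # [2, 2, 3, 4]
print(kraft_sum(n) <= 1)   # True: the chosen lengths satisfy (3.3)
```

Because D^{-n_i} ≤ w_i for each i, the Kraft sum is bounded by the sum of the weights, which is 1; this is exactly why the lengths defined by (4.10) automatically satisfy (3.3).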
Theorem 4.3. The lengths n_i (i = 1, 2, \dots, n) of Theorem 4.1 can be made to satisfy
(4.12)
(4.13)
Clearly, the chosen quantities satisfy ‘equality’ in Hölder’s inequality (4.5). Moreover, they satisfy Kraft’s inequality (3.3).
Suppose n_i is the unique integer between these two bounds; then obviously n_i satisfies (3.3).
Since we have
(4.14)
Since
Hence, (4.14) becomes
which gives the result (4.12).
Error correcting codes constitute one of the key ingredients in achieving the high degree of reliability required in the transmission of information. We therefore find the minimum value of the ‘useful’ mean lengths subject to a given constraint on code word lengths. It may be seen that the ‘useful’ mean lengths have been generalized parametrically and their bounds studied in terms of generalized measures of entropy.
We establish a result that, in a sense, provides a characterization of the proposed length under the condition of unique decipherability. The main objective of information is to remove uncertainty and fuzziness. Information is measured by the amount of probabilistic uncertainty removed in an experiment, and the measure of uncertainty removed is also called a measure of information, while a measure of fuzziness measures the vagueness and ambiguity of uncertainties. The results, with proofs, obtained in Theorems 4.1–4.3 not only generalize the existing fuzzy mean lengths and their bounds, but also contain all the known results as particular cases of the proposed length. Some new fuzzy coding theorems have also been proved.
Acknowledgments: None.
Conflicts of interest: None.
©2018 Habib. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and building upon the work non-commercially.