Tuesday, 23 June 2009

PARALLEL COMPUTING:

This topic was presented by

Mariam Mwakisisile and
Abdallah R. Sasya.

Introduction.
The topic concerns parallel computing. In early times much software was written for serial computation, which means it had to run on a single computer with one CPU.
A CPU is an integrated electronic device which interprets instructions for the computer, performs the logical and arithmetic processing operations, and causes the input and output operations to occur.
In serial computation only a single task is performed at a time, and instructions are executed one by one.
Because this process took a lot of time to perform a single piece of work, the concept of parallel computing, which allows multiple tasks to be carried out at the same time, was introduced. The concept was introduced by the mathematician John von Neumann in 1947; his idea was that different calculations can be performed on either the same data or on separate sets of data.
Parallel computing also needs a powerful computer, one which possesses two or more processors or CPUs.
A computer which possesses these will be capable of saving time, solving large and complex problems, providing concurrency, and allowing data sharing.
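To make the difference between serial and parallel execution concrete for myself, I wrote the small Python sketch below; it is only my own illustration (using the standard multiprocessing module and a made-up work function), not something from the presentation.

    import time
    from multiprocessing import Pool

    def work(n):
        # A deliberately slow, CPU-bound task used only for illustration.
        total = 0
        for i in range(n):
            total += i * i
        return total

    if __name__ == "__main__":
        tasks = [2_000_000] * 8

        start = time.time()
        serial_results = [work(n) for n in tasks]        # one task at a time, one CPU
        print("serial:  ", time.time() - start, "seconds")

        start = time.time()
        with Pool() as pool:                              # one worker per available CPU
            parallel_results = pool.map(work, tasks)      # tasks run at the same time
        print("parallel:", time.time() - start, "seconds")

        assert serial_results == parallel_results         # same answers, less waiting

On a computer with several CPUs the parallel run finishes noticeably faster, which is exactly the time-saving benefit described above.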

Challenges.
Applying and using parallel computing is expensive because it involves more hardware.
It sometimes increases traffic on the shared memory-CPUs path.
The programmer must ensure correct access to global memory.

Conclusion.
Applying parallel computing is very important because work is simplified and performed well. In many companies that work with large statistical data, computers with the ability to perform many tasks at a time should be used.
This is important to understand because we sometimes work with a lot of data and find ourselves complaining about the speed of the computer; this may be one of the reasons that needs to be examined carefully. It also helps us to know that we can increase the performance of a computer by adding processors or CPUs, which give the computer more power.

SOFTWARE ENGINEERING

This topic was presented by
Nelson Shoo and
Christine Obed.

Introduction.
Software engineering is the systematic approach to the design, construction, development and maintenance of computer programs.

Programming languages started to appear and spread in the 1950s; languages such as Fortran, Algol and Cobol were among the first to be released.
The birth of software engineering came in 1968-1970, as a result of the NATO software engineering conferences.
Since then there have been major improvements in software engineering.

Why we need software engineering.
To overcome the software crisis and the demands of the present generation.
To improve software development: producing software that is high quality, cheaper and maintainable, and delivering it on time.

Types of software.
There are about four types, distinguished by how the software is sold and licensed.
Retail software.
This is the type sold off the shelves of retail stores.
OEM software.
This refers to software sold in bulk to resellers and designed to be bundled with hardware, e.g. Microsoft.
Shareware.
This is software that can be downloaded from the internet, but after a trial period the customer has to purchase it.
Freeware.
This is another type of software which can be downloaded from the internet for free for personal use, while commercial use requires a paid license.

Software development falls under the following phases:
Software requirement analysis, System analysis and design, Code generation, Testing, and Maintenance.
Software engineering tools.
The process of developing software needs some tools that support it.
Engineering software.
These are computer-aided tools which provide automated support for the development process, for example code generation. These tools are referred to as CASE (Computer-Aided Software Engineering) tools.
Methodologies of Software Engineering.
Object-oriented programming (OOP).
This is an approach in which programs are designed around data structures (objects) that bundle data together with the operations on it (see the small sketch after this list).
Rapid Application Development (RAD).
This refers to a software development life cycle which uses minimal planning in favour of rapid prototyping.
Scrum ("all at once"). This is a methodology in which people with different experience work together to manage complex work, such as new product development.
Team Software Process (TSP).
A defined operational process framework designed to help teams of managers and engineers organize and produce large-scale software projects.
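As mentioned under OOP above, here is a tiny Python sketch of my own (the BankAccount class is just a made-up example) showing how an object bundles data together with the operations on it.

    class BankAccount:
        """A tiny object: data (a balance) plus the behaviour that manipulates it."""

        def __init__(self, owner, balance=0):
            self.owner = owner
            self.balance = balance

        def deposit(self, amount):
            self.balance += amount

        def withdraw(self, amount):
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount

    account = BankAccount("Mariam", 100)
    account.deposit(50)
    account.withdraw(30)
    print(account.owner, account.balance)   # Mariam 120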

Challenges of software engineering.
The practice of developing software is confronted with the following challenges:
The ongoing change of technology gives designers a hard time implementing the software society needs.
The rising complexity of user requirements and expectations also greatly affects the process of developing software which can satisfy the needs of all people.
Failure to deliver the expected software; this may be a result of poorly specified requirements from the customer.

Conclusion.
The topic has a great impact on our profession because it familiarizes us with the important processes involved in the development of software. Understanding the needs of the customer is an important thing. Programming is closely tied to software engineering, and a good programmer is not the one who merely knows how to write code but the one who ends up with the result intended by the people.

LEARNING DIARY ON ARTIFICIAL INTELLIGENCE:

This topic was presented by
Edward Coelestine and
Dastan Josephath.

Introduction.
Artificial intelligence is the branch of computer science which imitates intelligent human behaviour; it is the capability of a device to perform functions that are normally associated with human intelligence, e.g. reasoning, thinking and interpreting.
This technology attempts to see whether mechanical devices can be made to behave like human beings.
The term artificial intelligence combines two words, artificial and intelligence.
Artificial: this refers to man-made things, not natural ones, but made to be like something that is real or natural.
Intelligence: the ability to learn, understand, reason and think.
Robotics: the science and technology of robots, covering their design, manufacture and application.
In the 1950s the work of Turing gave artificial intelligence a new face because of his test.
Following the Turing test, many scientists started working on artificial intelligence, which led to the development of robots. For example, the Nomad robot explores remote regions of Antarctica looking for meteorite samples.
There have been other advancements in AI, with significant demonstrations in machine learning, intelligent tutoring, case-based reasoning, multi-agent planning, scheduling, uncertain reasoning, data mining, natural language understanding and translation, vision, virtual reality, games, and other topics.
The technology has several characteristics: reasoning (e.g. voice recognition), multi-agent planning, communication ability, identification (e.g. fingerprint recognition), manipulation of objects and interpretation.
Presently there are artificial intelligence machines such as:
machine translators, location detection (e.g. tags), automatic essay assessment, electronic sensors (e.g. for weapons or drugs), the black box in an airplane, and robots.
Artificial intelligence is applied in many areas, such as supermarkets, scientific experimentation, sports and games, domestic activities, location detection, security affairs and others.
Advantages of Artificial Intelligence.
Beyond the application areas explained above, the technology also has the following advantages.
Artificial intelligence simplifies work, improves efficiency and accuracy, enhances good communication, and reduces ambiguity.

Disadvantages of Artificial Intelligence.
It requires highly skilled experts, it leads to loss of jobs, it is hard to implement especially in third-world countries, and it is expensive.

My comments on the topic.
The topic is very interesting, but implementing it requires skilled people. The idea of making machines which perform activities as human beings do is great, but it is not easy, because machines cannot think and understand things the way human beings do. The strength of machines in efficiency and accuracy is that they do not become tired and do not forget things as human beings do.
There is still a great deal of hard work for experts in order to come up with machines that can resemble humans in some functions.

Thursday, 18 June 2009

REAL TIME GRAPHICS AND RENDERING

INTRODUCTION

The field of computer science is divided into different areas according to what each deals with.
This can be seen from previously presented fields such as cryptography (dealing with encryption and decryption), fingerprint recognition, iris recognition and many others, each driven by its own function.
Computer graphics: this is the art of computer-based mathematical representation of geometric objects such as buildings, vehicles or any other object, including humans. There are also different concepts which accompany computer graphics.
Raster: an image displayed through the arrangement of pixels on the screen.
Vector image: an image created by using mathematical algorithms and geometric functions to represent its size and shape.
Model: a 2D image or 3D object created to resemble the actual thing. All of these can be decorated to make a good impression.

HISTORY OF COMPUTER GRAPHICS

Computer graphics (CG) and rendering developed rapidly after the invention and rapid growth of software engineering in the 1960s. In 1961 the first video game, Spacewar, was created by Steve Russell.
The first program ever to use a complete graphical user interface, Sketchpad, was created in 1963. In 1971 one of the most important advancements for computer graphics, the microprocessor, appeared.
Later, Apple released the first personal computer to use a graphical interface. Finally, during the 1990s much of the film industry began incorporating CG into its business, which pushed CG and rendering forward.
There are also some steps in designing images in CGI: pre-design (planning before drawing begins), modelling (the process of creating models), texturing, animating (the rapid display of a sequence of 2-D or 3-D images) and rendering (the process of converting vector images into a raster image).
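To see for myself what rendering means in the sense of converting a vector description into raster pixels, I tried the rough Python sketch below; it is only my own simplification (a real renderer is far more elaborate).

    def rasterize_line(x0, y0, x1, y1, width, height):
        """Turn a vector line (two endpoints) into raster pixels by sampling."""
        grid = [["." for _ in range(width)] for _ in range(height)]
        steps = max(abs(x1 - x0), abs(y1 - y0), 1)
        for i in range(steps + 1):
            t = i / steps
            x = round(x0 + t * (x1 - x0))   # interpolate along the segment
            y = round(y0 + t * (y1 - y0))
            if 0 <= x < width and 0 <= y < height:
                grid[y][x] = "#"            # light the pixel nearest the line
        return grid

    for row in rasterize_line(0, 0, 15, 5, 16, 6):
        print("".join(row))

The vector form is just two endpoints; the raster form is the grid of lit pixels printed at the end.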
Another technology applied is OpenGL (the Open Graphics Library); OpenGL provides the following steps in drawing:


- Draw basic primitives
- Matrix operations
- Hidden surface removal
- Lighting and shading
- Texturing
- Fixed operations
- GPU computation

ADVANTAGES

It minimizes the cost of actual production.
It gives more flexibility and allows complex effects and movies to be produced in the film industry. This is due to integrated software which makes things appear real.
In the medical field, CG allows simulation of parts of the body and organs, which makes them easier to understand and supports critical treatment.
The main disadvantage of CG and rendering is that it takes away opportunities which could have provided employment for people.

CONCLUSION
Therefore CG and rendering are useful in our daily life when used efficiently; they can be applied in building simulation, films and all new technologies, making modern science easier to conduct.

Saturday, 13 June 2009

LEARNING DIARY ON IRIS RECOGNITION:

INTRODUCTION:
The research field of iris recognition was presented by Jumanne Ally and Robert Mwakajwanga on 12 June 2009. From this topic I have learned a lot of things that made me interested in it; let us look first at an introduction to iris recognition.
Iris recognition falls under biometric authentication. The word biometric refers to human identification and verification using biological traits. The iris is the muscle within the eye that regulates the size of the pupil, controlling the amount of light that enters the eye.
THE CONCEPT OF IRIS RECOGNITION:
I am very interested in what I learned about how the iris is formed. The presenters explained that the iris is formed at the prenatal stage of human development, growing through a process of tight forming and folding of the tissue membrane. The left and right irises of the same person are each unique; although genetically identical, an individual's two irises are unique and structurally distinct.
MECHANISMS OF IRIS RECOGNITION:
Another part I learned is how the iris is recognized, or the mechanism involved. Before recognition takes place, the iris is located using landmark features; the landmarks and the iris's distinct shape allow imaging, feature isolation and extraction.
The iris can then be captured using a special camera; for example, commercial iris cameras typically use infrared light to illuminate the iris without causing harm.
TEST OF STATISTICAL INDEPENDENCE:
To perform recognition, two iris codes are compared; the Hamming distance between the two iris codes is used as a test of statistical independence between them.
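As a rough illustration of that test, the Python sketch below is my own simplification; it uses short made-up bit strings rather than real 2048-bit iris codes, and the 0.32 match threshold is only an assumption borrowed from the literature.

    def hamming_distance(code_a, code_b):
        """Fraction of bit positions at which two equal-length binary codes differ."""
        assert len(code_a) == len(code_b)
        differing = sum(1 for a, b in zip(code_a, code_b) if a != b)
        return differing / len(code_a)

    enrolled = "1101001110100101"   # made-up enrolled iris code
    probe    = "1101011110100001"   # made-up probe captured at the scanner

    distance = hamming_distance(enrolled, probe)
    print("Hamming distance:", distance)
    # A distance well below about 0.32 is commonly treated as a match;
    # the exact threshold here is only an illustrative assumption.
    if distance < 0.32:
        print("Likely the same iris")
    else:
        print("Likely different irises")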
ADVANTAGES AND DISADVANTAGES OF IRIS RECOGNITION:
Recognition is easy once a person's eye has been captured for identification. In addition, since each iris is unique, this ensures maximum security; it also allows identification of persons who have limited use of hands or arms, and it offers stability, or template longevity.
However, although iris recognition has these advantages, it also has disadvantages: for example, the problem of scanner height adjustment, where it sometimes fails to capture exactly the image required at a particular time; the small size of the target; the large size of the system; the very high cost, which prevents individuals from owning such instruments; and finally, some users do not have enough skills to use the machine.
CHALLENGES OF IRIS RECOGNITION:
In this part I learned a lot about the challenges of iris recognition, among them scanning at a distance, cost, border control, environmental challenges and obtaining high-quality images.
CONCLUSION:
To ensure maximum security at our country's borders as well as personal security, we should not lag behind this technology. Iris recognition is especially attractive due to the stability of the iris texture pattern.

LEARNING DIARY ON PROGRAMMING/COMPILER:


INTRODUCTION:
I am very interested in this topic, which was presented by Kwangu Masalu and Geofrey Mhando. The following are terms that I learned during the presentation. A compiler is a program that converts source code (a programming language) into machine language. Machine language is the representation of a computer program which is actually read and interpreted by the computer, while a decompiler is a program that translates from a low-level language back to a higher-level language.
THE CONCEPTS OF PROGRAMMING AND COMPILER:
THE HISTORY OF COMPILER
Software for early computers was primarily written in assembly language. Higher-level programming languages were not invented until the benefit of being able to reuse software on different kinds of CPU started to become significantly greater than the cost of writing a compiler.
Other terms used in programming and compilers that I learned during the presentation: a language translator is a type of software that translates a program written in a second, third, or higher generation language into machine language; an interpreter translates source code instructions and executes them one statement at a time; and syntax and semantics are, respectively, the grammar rules and the meaning of commands.
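To make the idea of an interpreter concrete for myself, I wrote the toy Python sketch below; it is only my own illustration of a made-up three-command language, executed one statement at a time, and not how any real interpreter is built.

    def interpret(program, variables=None):
        """Execute a toy language, one statement at a time."""
        variables = {} if variables is None else variables
        for line in program.splitlines():
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "set":            # set x 5
                variables[parts[1]] = int(parts[2])
            elif parts[0] == "add":          # add x 3
                variables[parts[1]] += int(parts[2])
            elif parts[0] == "print":        # print x
                print(parts[1], "=", variables[parts[1]])
            else:
                raise SyntaxError("unknown statement: " + line)
        return variables

    source = """set x 5
    add x 3
    print x"""
    interpret(source)   # prints: x = 8

A compiler, by contrast, would translate the whole source into machine code first and only then let the computer run it.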
PROGRAMMING AND COMPILER:
In this topic I learned about three types of programming language. Machine language is a language directly understood by the machine or computer; it is difficult to modify and uses the computer's natural language of 0s and 1s only. The other types, assembly language and higher-level languages, both require intermediate translation steps before the machine can execute them. Assembly language is a low-level programming language, while examples of higher-level languages include COBOL, PASCAL and FORTRAN, among others. Higher-level languages allow users to specify the desired results without having to specify the detailed procedures needed to achieve them.
COMPUTER PROGRAMMING LANGUAGE:
This is important for students in all disciplines of computer science: it helps them understand programs found in different areas, for example using code to display or obtain output from any source. It also increases one's vocabulary of useful programming constructs, makes it easier to design and learn a new language, and supports software development and interface writing.
CHALLENGES:
Programming and compilers also have challenges: current programming languages are challenging to learn, skilled programmers are needed to run difficult programs, and debugging is difficult.
CONCLUSION: Programming and compilers are a good research field to understand because they help the user understand many things that happen on the machine.

Thursday, 11 June 2009

LEARNING DIARY ON FINGERPRINT RECOGNITION:

Introduction:
The topic was presented by Mr. Ayoub Mugube and John Malasa. Physical identification considers physical appearance, voice, and other sensory data. Another term used in fingerprint recognition is biometrics, the way of uniquely recognizing humans based upon one or more intrinsic physical or behavioural traits; biometrics includes face, iris and retina scanning, voice identification and others, of which the fingerprint is one of the most convenient and foolproof.

The Concepts of Fingerprint Recognition:
A fingerprint is the impression left upon any surface with which the finger comes in contact under pressure. Fingerprint recognition refers to the automated method of verifying a match between two human fingerprints.

History of Fingerprint recognition.
In 1880 Dr. Henry Faulds was the first to publish a scientific account of the use of fingerprint as a means of identification.
In the late 1960s, Galton points were utilized to develop automated fingerprint technology.
In 1969 the FBI developed a system to automate its fingerprint identification process.
Fingerprint technology has continued to improve up to the present.

Advantages and Disadvantages of Fingerprint Recognition.
The advantages: there is less chance of fraud, because everyone has unique fingerprints; it can be applied to modern computers, cars and automatic doors; a fingerprint cannot be lost; fingerprints do not change naturally; its reliability and stability are higher compared to iris, voice and face recognition methods; and fingerprint recognition equipment is relatively low-priced compared to other biometric systems. The disadvantages: some criminals burn their fingers with acid; it can be time consuming; some people do not trust new technology; and some people have damaged or missing fingerprints.

Classification of Fingerprints:
Human fingerprints fall into three basic patterns. LOOPS start on one side and curve around to end on the same side; this type accounts for about 65% of all fingerprints and, looked at closely, shows a delta form. WHORLS start in the middle and keep getting bigger, looking like a group of circles one inside another, a rounded or circular ridge pattern; this kind also shows deltas. ARCHES are shapes that start on one side and end on the other side like a hill; this kind has no delta form.

Fingerprint visibility:
Fingerprints that can be seen with the naked eye are called visible prints, e.g. in dried blood or wet paint. Fingerprints can also leave an impression on an object in a mouldable substance, e.g. gum or soap.

Challenges.
Fingerprints stored in a database can be obtained by hackers.
The process of storing fingerprints can itself weaken security.
Due to technical problems, some sensors do not read fingerprint images properly.
Sometimes criminals cut their fingers.

Conclusion.
Fingerprint recognition is important regardless of the disadvantages because it distinguishes individuals accurately and uniquely.
Fingerprinting is even more discriminating than DNA analysis, which, with today's technology, cannot distinguish between identical twins.
No two individuals can have identical fingerprints.

LEARNING DIARY ON PROGRAM/SOFTWARE VISUALIZATION:

Introduction:
Program/software visualization was presented by Rachael Myinga and Venance Luhemeja, students of Tumaini University in Iringa Region. The content was presented well and was attractive enough to make me look for other materials to learn more about it. I will start by defining each term independently and then define them as a whole. A program is a set of executable instructions that solves a problem or difficulty. Visualization is making things visible, or observable to the mind or imagination. Combining the two terms, we get program/software visualization, which consists of producing animated views of a program as it executes.

Concept of program/software visualization:
There are basically two types:
Visualization of a single component.
(Source code and quality defects during software development and maintenance activities).
Visualization of the whole (sub)system.
(To investigate the architecture or to apply visual analytics techniques for defect discovery.)
Program/software visualization aims at the systematic creation of visual representations. It binds data to representations that can be perceived, e.g. visual, auditory and tactile, and to the specification of user explanations.

Importance of program/software visualization.
Program/software visualization helps improve performance, supports making programs more comprehensible by converting data into a graphic representation, and helps programmers understand program behaviour and code better; for example, instead of a blank screen you can see the code as it runs and the output it produces.
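As a small taste of what such a tool does, the Python sketch below is my own crude experiment: it uses the standard sys.settrace hook to print each line of a function as it executes, a very simple form of visualizing program behaviour.

    import sys

    def tracer(frame, event, arg):
        # Called by the interpreter for every event in the traced code.
        if event == "line":
            print(f"executing {frame.f_code.co_name}, line {frame.f_lineno}")
        return tracer

    def sum_of_squares(n):
        total = 0
        for i in range(n):
            total += i * i
        return total

    sys.settrace(tracer)        # switch the crude visualizer on
    result = sum_of_squares(3)
    sys.settrace(None)          # switch it off again
    print("result:", result)

A real visualization tool would turn this stream of events into an animated picture instead of plain printed lines.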

Problems;
Program/software visualization is accompanied by some problems, including limited screen space, which makes it very difficult to observe how the code is executed. Unfavourable conditions or circumstances which do not allow the code to run on a particular computer are also still a problem. In addition, the aspect of behaviour to be visualized must be identified. Handling real-world problems (most uses are on small client or lab programs with few lines of code), security of the program/software, network latency, and designing and specifying the visualization are further challenges of program/software visualization.

Conclusion;
Programs are not only files but indeed technical publications. In order to turn programs into publications, some principles need to be adopted so that the code can be properly and nicely mapped into visible language constructs, and everybody can understand it well and apply it for their benefit.

Monday, 4 May 2009

LEARNING DIARY ON USABILITY ENGINEERING AND HUMAN COMPUTER INTERACTION

Introduction:
This topic was presented by Mr. Innocent Kihaka and Mr. Side S. Side on 27/04/2009. In my opinion it was the best presentation among those already given; the topic was good and well understood.
Before going deeper into this topic, let us look at some important terminology used in it, which can sometimes be confusing if not used carefully. Usability: usability means that something can be very easy to use but not necessarily useful; it is normally user friendly. Usefulness: usefulness means that something is very useful even though it may be very difficult to use, i.e. not user friendly.
THE MEANING OF USABILITY ENGINEERING AND HUMAN COMPUTER INTERACTION.
Usability engineering (UE):
This is the technique of developing a system which is interactive, usable and meets the needs of its users.
Human computer interaction (HCI):
This is the discipline which deals with the study of interaction between users and computers.
Why UE and HCI? Many years ago people were struggling with how to improve the efficiency of machines. Research made at that time showed that computer users were diverse and that human-interactive systems were needed in order to improve machine efficiency. Over the years this has developed into a discipline/field in its own right. The intention was to increase productivity, as the workforce becomes more productive; to decrease training and support costs; to increase customer satisfaction; and to decrease technophobia. They succeeded to some extent, since these efforts reduce disastrous and fatal errors in systems which are dangerous to human life.

HOW UE AND HCI CAN BE ACHIEVED
Approaches of designing products
Product-centred design
User-centred design
PRODUCT-CENTRED DESIGN
This is the design process which does not take account of the user's needs.
Assumptions of the product centered design
-The result of a good design is having a product working.
-The product’s specifications are derived from the customer.
- There’s no further contact until delivery.
USER-CENTRED DESIGN
This is the design practice rooted in the idea that the user must take centre stage in the design of any computer system.
More precisely, it makes the human being the focal point that technology should serve, rather than the other way round.
Assumptions of the user centered-approach:
-The result of a good design is a satisfied user.
-The process of design is a collaboration between designers and users.
-User and designer are in constant communication during the entire process
Criteria of user-centred approach
Efficiency-refers to how supportive the product is in carrying out a certain task.
Memorability - refer to how easy it is to remember how the system is used once it has been learned.
Learnability-refer to how easy it is to learn to use the product.
Safety-Protecting the user from dangerous and undesirable situations.
Entertaining-if a product was for entertainments then it has to reach that goal.
Motivating- the product made should encourage the user to continue working with it
Visibility - links in the interface have to be clearly visible.
Affordability - the price of the product should relate to the income of the society.
Feedback- the interface has to respond on whatever is done and give right suggestions.
Colour- Product should take an account of colour psychology.
CHALLENGES: It is hard to get requirements from users who have no technical expertise.
Due to technological advancements, society's needs vary with time.
A large amount of funding is needed for the development of a new product.
It is difficult to build a system which can suit all types of users and their needs.
FUTURE:
Computers are still cumbersome and unwieldy tools; we need small, specific appliances for specific jobs.
Decreasing costs but increased efficiency of products.
Increasing innovation in input techniques (e.g. voice, gesture, pen).

Monday, 27 April 2009

LEARNING DIARY ON EDUCATIONAL TECHNOLOGY.



INTRODUCTION:
On 24th April 2009 the topic discussed was educational technology. It is a good topic, concerned with supporting learning through technology, which is a much debated issue in the world, for example the use of computers and other peripherals for studying. The topic was presented by Mr. Bahart Sanga and Austin Godfrey; the research field was presented well and was well understood. The following is a full explanation of the topic.

Educational technology (also called learning technology) is the study and ethical practice of facilitating learning and improving performance by creating, using and managing appropriate technological processes and resources.
It includes the systems used in the process of developing human capability.


The Characteristics of educational technology;
It involves the input, process and output aspects of education. The input means having the right materials to help and facilitate learning, for example computers and peripherals that help work to be done faster and more effectively, and it also involves the skill of teaching and awareness of teaching and training methods. The process generates situations for presenting the subject matter systematically. The output mainly covers the terminal behaviour of the activities, or clarifies the objectives achieved through the content.

It is an application of scientific knowledge to education and training; it includes the organization of learning and the conditions for realizing the goals of education; and it facilitates learning by controlling the environment, media and methods.

Major objectives of educational technology
To analyze the characteristics of the learner.
To organize the content in logical or psychological sequence
To evaluate the learner's performance in terms of achieving educational objectives.

Problems facing Educational Technology.
There is a high cost to buy and run these devices, and many areas lack good conditions for learning and proper administration of the devices. Many people fail to obtain the devices that help in studies, for example computers and other tools facilitating learning, due to financial problems; this is a big problem facing society, and the devices are still so expensive that people cannot afford to buy them.

We also do not have enough professional and competent teachers to help society use the learning tools and the technology we have.

Some of the societies.
Learned societies concerned with educational technology include:

The Association for Educational Communications and Technology (AECT) is an academic and professional association.
Its headquarters is in Bloomington, Indiana.
It moved there from Washington DC in 1996.
The Association for Learning Technology (ALT) is a United Kingdom professional body.
It brings together people with an interest in the use of learning technology.
ALT was founded in 1993.
It is a registered charity (charity number 1063519).
The aims of these organisations are to:
Contribute to the development of policy.
Spread good practice in the use of learning technology.
Represent and support their members and provide services for them.

Future.
In order to have good skills in education, we should keep learning about these changes in technology, for example the use of computers, which will help us develop in academic matters; with good technology in place, all academic matters will run faster and be simpler when we work with or apply it.

Tuesday, 21 April 2009

LEARNING DIARY ON IMAGE COMPRESSION.



I am very interested in the topic of image compression, which was presented by Mr. Justine L. Ngimba and Ms Zuena Mgova on 20/04/2009; the topic was good and well understood. On top of that, I have learned a lot of things about image compression and came to realize that it was a good, well-delivered presentation from my colleagues.

THE MEANING OF IMAGE COMPRESSION.
Image compression is minimizing the size in bytes of a graphics file without degrading the quality of the image to an unacceptable level. The reduction in file size allows more images to be stored in a given amount of disk or memory space. It also reduces the time required for images to be sent over the Internet or downloaded from Web pages.
There are several different ways in which image files can be compressed. For Internet use, the two most common compressed graphic image formats are the JPEG format and the GIF format. The JPEG method is more often used for photographs, while the GIF method is commonly used for line art and other images in which geometric shapes are relatively simple. In other words, image compression is the action of reducing the size of an image by increasing its compactness; data compression is the general technique used to reduce larger data files to a smaller file size.


Types of image compression.


Image compression can be lossy or lossless. Lossless compression is sometimes preferred for artificial images such as technical drawings, icons or comics. This is because lossy compression methods, especially when used at low bit rates, introduce compression artifacts. Lossless compression methods may also be preferred for high value content, such as medical imagery or image scans made for archival purposes. Lossy methods are especially suitable for natural images such as photos in applications where minor (sometimes imperceptible) loss of fidelity is acceptable to achieve a substantial reduction in bit rate.
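To see the flavour of lossless compression for myself, I tried the small Python sketch below using run-length encoding, one of the simplest lossless schemes; it is only my own illustration with a made-up row of pixels, not how JPEG or GIF actually work.

    def rle_encode(pixels):
        """Lossless run-length encoding: store (value, run length) pairs."""
        encoded = []
        for value in pixels:
            if encoded and encoded[-1][0] == value:
                encoded[-1][1] += 1
            else:
                encoded.append([value, 1])
        return encoded

    def rle_decode(encoded):
        return [value for value, count in encoded for _ in range(count)]

    row = [255, 255, 255, 255, 0, 0, 255, 255, 255]   # one row of a simple image
    packed = rle_encode(row)
    print(packed)                       # [[255, 4], [0, 2], [255, 3]]
    assert rle_decode(packed) == row    # lossless: the original row comes back exactly

A lossy method, by contrast, would deliberately throw some detail away and the decoded row would only approximate the original.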
We might ask together: why image compression? What I learned from my fellow students is that we compress in order to make the transfer of images over a network faster, to save space on storage devices, and to save time when images are downloaded; these are some of the reasons why we compress images.

Image format.
The format of image includes:-
TIFF (Tagged Image File Format)
JPEG (Joint Photographic Experts Group)
GIF (Graphical Interchange Format)
PNG (Portable network graphics)
BMP (Bitmap)

ADVANTAGES OF IMAGE COMPRESSION.

Reduces data storage requirements.
Reduces the time for images to be downloaded.
Useful for e-mail attachments.
Useful for web pages.
Useful for photo sharing websites.

DISADVANTAGES OF IMAGE COMPRESSION.

Reduces the reliability of image records.
Reduction of information or bits.
Time consuming.
Disruption of data properties.

NOTE: THIS IS VERY IMPORTANT TO KNOW IN IMAGE COMPRESSION.
When compressing an image, start from the original image, since compressing an already compressed image leads to poor image quality.
Do not save to the new format if you think you still need to compress the image further.

Monday, 20 April 2009

LEARNING DIARY ON DATA MINING:



I am very interested in the topic of data mining, which was presented by Mr Joshua Sendu and Remy Kaaro on 17/04/2009; the topic was good and well understood.
On top of that, I have learned a lot of things about data mining and came to realize that it was a good, well-delivered presentation from my colleagues.
THE MEANING OF DATA MINING.

Data mining is the process of extracting hidden patterns from large amounts of data.
Generally, data mining (sometimes called data or knowledge discovery) is the process of analyzing data from different perspectives and summarizing it into useful information - information that can be used to increase revenue, cut costs, or both. Data mining software is one of a number of analytical tools for analyzing data. It allows users to analyze data from many different dimensions or angles, categorize it, and summarize the relationships identified. Technically, data mining is the process of finding correlations or patterns among dozens of fields in large relational databases.
BACKGROUND.
Humans have been "manually" extracting information from data for centuries, but the increasing volume of data in modern times has called for more automatic approaches. The increasing power of computer technology has made data collection, processing, management and storage easier. However, the captured data needs to be converted into information and knowledge to become useful. Data mining is the process of using computing power to apply methodologies, including new techniques for knowledge discovery, to data.
There are three stages of data mining, which are;
Data exploration: this stage usually starts with data preparation, which involves cleaning and transforming the data. Data exploration is a methodology in which manual techniques are used to find one's way through a data set and bring important aspects of that data into focus for further analysis. Though such a methodology can be applied to data sets of any size or type, its manual nature makes it more suitable for smaller data sets, especially those in which the data has been carefully gathered and constructed.
Model building and validation:
This stage involves considering various models and choosing the best one based on its predictive performance.
Deployment: This is the final stage, which involves taking the model selected as best in the previous stage and applying it to new data in order to generate predictions or estimates of the expected outcomes.
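As a toy example of the kind of hidden pattern data mining looks for, the Python sketch below is my own illustration with made-up shopping data; it simply counts how often pairs of items are bought together, a very simplified form of association analysis.

    from collections import Counter
    from itertools import combinations

    # Made-up transactions, purely for illustration.
    transactions = [
        {"bread", "milk"},
        {"bread", "milk", "eggs"},
        {"milk", "eggs"},
        {"bread", "milk"},
    ]

    pair_counts = Counter()
    for basket in transactions:
        for pair in combinations(sorted(basket), 2):
            pair_counts[pair] += 1

    # The most frequent pairs are candidate "hidden patterns" in the data.
    for pair, count in pair_counts.most_common(3):
        print(pair, "bought together in", count, "transactions")

A real system would run the exploration, model-building and deployment stages described above on far larger databases, but the underlying idea of finding correlations is the same.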
There are several advantages of data mining; the following are some of them. In banking, the bank can find useful information, analyse it properly and then advertise to people, who can learn from it. Other areas that benefit include marketing, research and law enforcement.
On the other hand, despite these advantages, data mining also has some disadvantages. Security: very high security is needed to protect the data; if the technology used to store or collect the information is not good, the information can be lost quickly. Misuse of information: if the information is not used properly, it becomes easier for other people to learn secrets that should not be known by others. Privacy issues: there are some matters which people do not want to become public, which is why they are kept hidden.

Conclusion: Data mining is a good topic, and people should understand it well for their benefit, for example to protect information that should not become public. Therefore the topic was good.



Tuesday, 7 April 2009

AUTOMATIC ESSAY ASSESSMENT.


LEARNING DIARY ON
AUTOMATIC ESSAY ASSESSMENT.


The research field was presented by Mr. Nyambalafu and Miss Rose Mchaka on 6th April 2009. The topic was good and well understood, although at first it was very difficult for me to understand the meaning of automatic essay assessment; now I feel okay about it.
The meaning of Automatic Essay Assessment.
It refers to the use of an electronic machine to evaluate essays, whereby the machine can, for example, determine the grade of a student, detect plagiarism by showing whether sentences have been plagiarised, and analyse learning outcomes (make an evaluation).
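One small ingredient of such a system can be sketched simply. The Python example below is my own illustration (not how any real product works): it flags possible plagiarism by measuring word overlap between a student's sentence and a source sentence, with an arbitrary threshold.

    def word_overlap(sentence_a, sentence_b):
        """Jaccard similarity of the word sets of two sentences (0 = none, 1 = identical)."""
        words_a = set(sentence_a.lower().split())
        words_b = set(sentence_b.lower().split())
        if not words_a or not words_b:
            return 0.0
        return len(words_a & words_b) / len(words_a | words_b)

    source  = "parallel computing allows many tasks to run at the same time"
    student = "parallel computing allows many tasks to run at the same moment"

    score = word_overlap(source, student)
    print("similarity:", round(score, 2))
    # The 0.7 threshold is an arbitrary illustrative choice, not a real product setting.
    if score > 0.7:
        print("Possible plagiarism - flag for the teacher")

Real essay assessment systems use much richer measures of content and style, but the idea of comparing a student's text against reference material is the same.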


The following are the advantages, or importance, of automatic essay assessment.
These are categorized into two parts.
To the teacher:
It saves time in the marking process.
It saves time in making evaluations.
It helps in producing a final grade.
It helps a teacher to be consistent.
To the students:
It challenges students to think critically.
It provides real feedback to the students.
It looks at the quality of the student's work and not comparisons with others.
It supports students in reflecting on the learning process.

Therefore, having an electronic machine to do such activities is a good plan because it simplifies the work, but problems also arise: it is very costly, because a lot of money is needed to buy the program; it can reduce the thinking capacity of the teacher, since instead of working harder the machine simplifies everything; and it needs highly qualified people to set it up.


Future.
To have a machine which can assess each and everything.
To give back the whole essay with all corrections.
To have a machine which can assess hardcopy essays.

Monday, 6 April 2009

SPEAKER RECOGNITION.


LEARNING DIARY ON
SPEAKER RECOGNITION:
This topic was prepared by students of Tumaini University College, Mr. Benjamin Mkwese and Ruben Minael, first-year students of the Bachelor of Science in Information and Communication Technology.
The research field was good and well understood. At first it was very difficult to select this topic, because of fear and wondering where I could get the material for the presentation, but at the end of the preparation I concluded that it is a good topic.
As human beings, we are able to recognize someone just by hearing him or her talk. Usually, a few seconds of speech are sufficient to identify a familiar voice. The idea to teach computers how to recognize humans by the sound of their voices is quite evident, as there are several fruitful applications of this task.
SPEAKER RECOGNITION:
Speaker recognition is the process of automatically recognizing who is speaking on the basis of individual information included in speech signals.
SPEAKER RECOGNITION HAS BEEN CATEGORISED INTO TWO CLASSES:
It can be divided into Speaker Identification and Speaker Verification. Speaker identification determines which registered speaker provides a given utterance from amongst a set of known speakers. Speaker verification accepts or rejects the identity claim of a speaker - is the speaker the person they say they are?
Speaker recognition technology makes it possible to use the speaker's voice to control access to restricted services, for example phone access to banking, database services, shopping or voice mail, and access to secure equipment.

HOW THE SYSTEM WORKS:
Another thing I learned in this topic is how the system works.
The voice input to the microphone produces an analogue speech signal. An analogue-to-digital converter (ADC) converts this speech signal into binary words that are compatible with a digital computer. The converted binary version is then stored in the system and compared with previously stored binary representations of words and phrases.
The current input speech is compared, one template at a time, with the previously stored speech patterns as the computer searches. When a match occurs, recognition is achieved. The spoken word, in binary form, is written on a video screen or passed along to a natural language understanding processor for additional analysis.
During training, the computer displays a word and the user reads it aloud. The computer digitizes the user's voice and stores it. The speaker has to read aloud about 1000 words. Based on these samples, the computer can predict how the user utters words that are likely to be pronounced differently by different people.
It is noted that even if the same speaker speaks the same text, there are always slight variations in the amplitude or loudness of the signal, pitch, frequency and timing. For this reason there is never a perfect match between the templates and the binary input word.
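The matching step described above can be caricatured in a few lines of Python. This is my own simplified sketch with made-up feature vectors, not the actual signal processing: each stored template and the new utterance are reduced to a short list of numbers, and the closest template wins.

    import math

    def distance(features_a, features_b):
        """Euclidean distance between two feature vectors of equal length."""
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(features_a, features_b)))

    # Made-up feature vectors standing in for stored voice templates.
    templates = {
        "alice": [0.9, 0.2, 0.4],
        "bob":   [0.1, 0.8, 0.7],
    }

    utterance = [0.85, 0.25, 0.35]   # features extracted from the new speech input

    best_speaker = min(templates, key=lambda name: distance(templates[name], utterance))
    print("Recognized speaker:", best_speaker)   # closest template: alice

Because of the natural variation described above, the system accepts the closest template rather than demanding a perfect match.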

APPLICATION:
One of the many applications of speaker recognition is in military operations. Voice control of weapons is an example: with reliable speech recognition equipment, pilots can give commands and information to the computers simply by speaking into their microphones; they do not have to use their hands for this purpose.
CONCLUSION:
The topic was good and well understood. It would be wise to register with such a system in order to secure the materials you have in a machine such as a computer.

Thursday, 2 April 2009

CRYPTOGRAPHY.


The research field was presented by Mr. Izadin Abdalah and Mr. Joseph Uwemba, first-year IT students from Tumaini University College.
At first it was very difficult to understand the term cryptography, or even to select it as my research field. But after getting the concepts and knowledge about cryptography from my fellow students, I now understand the terminology well.

The following are the most important things I have learned about cryptography:
Cryptography:
This term comes from two Greek words: kryptos, meaning hidden, and graphein, meaning writing. Cryptography is therefore the study of hiding information, and it provides a great tool for protecting information. What I have learned here is that the term describes the process of hiding information so that it is not exposed or lost.
Another thing I learned on this topic is that, as Mr. Izadin Abdalah and Joseph Uwemba said in their presentation, the aim of this research field for an IT professional is to use cryptography as a security tool for the computer system.
Cryptography was also used as a tool to protect national secrets and strategies; this technology was used by the Egyptians long ago.
The components of cryptography are as follows:
- An algorithm (or cryptographic method).
- Keys.
Cryptographic systems are classified in different ways; some of them are:
Secret key systems (also known as symmetric systems) and public key systems (also known as asymmetric systems). What I learned here about a secret key system is that it uses only a single key for both encryption and decryption.
That is, in a secret key system only one key is used: the same key that locks (encrypts) the information is the one that unlocks (decrypts) it.
A public key system uses a pair of keys, one to lock and one to unlock, for security. According to Mr. Joseph Uwemba and Izadin Abdalah, this type of cryptography is the more modern one in application compared to the other, and it is also used to provide a digital signature capability.
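The single shared key of a secret key (symmetric) system can be illustrated with a toy XOR cipher; the Python sketch below is my own example only, since real symmetric systems such as AES are vastly stronger, but the shape is the same: one key both encrypts and decrypts.

    def xor_cipher(data: bytes, key: bytes) -> bytes:
        """Toy symmetric cipher: XOR each byte with the key (repeated as needed)."""
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    key = b"secret-key"
    plaintext = b"meet me at noon"

    ciphertext = xor_cipher(plaintext, key)     # encryption with the shared key
    recovered  = xor_cipher(ciphertext, key)    # decryption with the very same key

    print(ciphertext)
    print(recovered)            # b'meet me at noon'
    assert recovered == plaintext

In a public key system, by contrast, the key that encrypts is different from the key that decrypts, which is what makes digital signatures possible.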
In fact, the discussion was good and I understood it well.