Knowledge is knowledge

Introduction to Motors, Generators, and Transformers

In this subject we deal, most of the time, with rotating machines, especially electromechanical energy-conversion machines, both DC and AC; that is, with motors and generators.

Motor: a machine used to convert electrical energy into mechanical energy.

Generator: a machine used to convert mechanical energy into electrical energy.

We deal with these machines in the subject of electrical machines. Electricity is generated at only about 10-12 kV, but it is transmitted at about 465-800 kV, so here the need for transformers comes into the picture. These are of two types:

Step-up transformer: whenever we need to raise the voltage from the generation level (10-12 kV) to the transmission level (465-800 kV), we use this type of transformer.

Step-down transformer: whenever we need to lower the voltage from the transmission level to the domestic level, we use this type of transformer.
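Both conversions above are the ideal-transformer turns-ratio relation Vs/Vp = Ns/Np applied in opposite directions. Here is a minimal Python sketch; the voltage levels and turn counts below are made-up illustrative values, not data for any real transformer:

```python
def secondary_voltage(v_primary, n_primary, n_secondary):
    """Ideal transformer: Vs = Vp * (Ns / Np).
    Ns > Np gives a step-up transformer, Ns < Np a step-down one."""
    return v_primary * n_secondary / n_primary

# Step-up: 11 kV generation level to 440 kV transmission level (1:40 ratio)
print(secondary_voltage(11_000, 10, 400))      # -> 440000.0
# Step-down: 440 kV transmission level to 220 V domestic level (2000:1 ratio)
print(secondary_voltage(440_000, 20_000, 10))  # -> 220.0
```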




Introduction to Field Effect Transistors (FETs): similarities and differences with BJTs

  • BJT and FET are both three-terminal devices.
  • BJTs and FETs are of nearly equal importance.
  • A BJT is a current-controlled device while a FET is a voltage-controlled device; this is the main difference between them. ( Ic = f(Ib) and Id = f(Vgs) )
  • The BJT is a bipolar device; FETs are unipolar devices.
  • FETs are of two types: n-channel and p-channel.
  • FETs can be used as amplifiers or switches.
  • FETs are classified into two categories: JFET and MOSFET.
  • JFETs are classified into two categories: n-channel and p-channel.
  • MOSFETs are classified into DMOSFET and EMOSFET, and both of them are further divided into n-channel and p-channel.
  • FETs are known as field-effect transistors because an electric field, set up by the charge present, controls the conduction path of the output circuit. So there is an effect of an electric field, and hence the device is known as a field-effect transistor.
  • Input impedance: FETs have higher input impedance than BJTs.
  • FETs are more temperature-stable than BJTs.
  • FETs are smaller than BJTs.
  • BJTs are more sensitive to the applied signal than FETs.
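The Id = f(Vgs) relation in the list above is, for a JFET in saturation, the standard Shockley equation. A small Python sketch; the Idss (10 mA) and pinch-off voltage Vp (-4 V) are hypothetical device parameters, not taken from any particular datasheet:

```python
def jfet_drain_current(vgs, idss=10e-3, vp=-4.0):
    """Shockley equation for an n-channel JFET in saturation:
    Id = Idss * (1 - Vgs / Vp)**2, valid for Vp <= Vgs <= 0."""
    return idss * (1 - vgs / vp) ** 2

print(jfet_drain_current(0.0))   # -> 0.01 (full Idss when Vgs = 0)
print(jfet_drain_current(-2.0))  # -> 0.0025 (a quarter of Idss at Vgs = Vp/2)
```

This shows the "voltage-controlled" behaviour: the drain current is set by the gate-source voltage, not by a gate current.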



Some Best Animes

Naruto


This anime is one of the best I have ever seen. It is about a boy named Naruto who wants to become Hokage; he is treated differently from others because a monster, the Nine-Tails spirit, is sealed inside him, the one that destroyed half of the Leaf Village (the village in which Naruto lives). The characters are all shinobi, which are basically ninjas.
It has approximately 220 episodes, after which Naruto Shippuden starts.

Death Note




This is also one of the greatest anime I have ever seen. In it there is a notebook: if anyone writes a name in it, that person dies of a heart attack, and if the writer also mentions a cause of death, the person dies in that way. The notebook belongs to the Shinigami world, but the Shinigami are bored, so one throws the notebook into the human world. A genius student named Light Yagami finds it and, with his brilliant mind, tries to eliminate all the criminals from the world. People call him Kira. The anime has approximately 40 episodes, and it is one of the best; I suggest you watch it now.

Erased


This anime is also one of my favorites; it is a short anime of approximately 12 episodes. In it, a person is somehow able to move from the present to the past, and he is able to change something in the past that prevents the future crimes he has seen.

True RMS Reading Voltmeter


The scale of a general (average-reading) voltmeter is calibrated in terms of the RMS value of a sinusoidal waveform only. So, to read the RMS value of any waveform, we use a true RMS reading voltmeter.

In this instrument, the input signal is amplified by an AC amplifier, the electrical energy is converted into heat energy, and the temperature change is sensed by the main thermocouple; the heat energy is thus converted back into electrical energy, which is then measured by the PMMC meter. A balancing thermocouple is used to cancel the non-linear behaviour of the main thermocouple.

Here the electrical energy is converted into heat energy, and we know that the heat energy is proportional to the square of the RMS current (and voltage); hence the RMS voltage can be measured.

Electrical energy -> Heat Energy -> Electrical Energy

Here the DC amplifier is used to amplify the potential difference created by the thermocouple.

Note: A thermocouple is a device used to measure temperature. A potential difference is developed between the two rods of the thermocouple (made of different materials) according to the temperature.

Note: The scale of the PMMC meter can be calibrated in terms of the RMS input voltage of any waveform.

Advantage: RMS voltage of any waveform can be measured.

Disadvantages: uneconomical; more chances of error, because many parts are present; energy loss.
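The square-law relation above is exactly why this instrument can read any waveform: the RMS value is the square root of the mean of the squared signal. A short Python sketch of that definition (the sample waveforms are illustrative); it shows why a meter calibrated only for sine waves would misread a square wave:

```python
import math

def true_rms(samples):
    """RMS value of a sampled waveform: sqrt(mean of the squares)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

n = 10_000
sine = [math.sin(2 * math.pi * i / n) for i in range(n)]  # 1 V peak sine
square = [1.0 if i < n // 2 else -1.0 for i in range(n)]  # 1 V peak square

print(round(true_rms(sine), 3))    # -> 0.707 (peak / sqrt(2))
print(round(true_rms(square), 3))  # -> 1.0 (equal to the peak)
```

Both waveforms have the same 1 V peak, yet their RMS values differ, which is what an average-reading meter calibrated for sine waves gets wrong.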


Why Python is so popular?


Let's discuss why Python is so popular.

Python has a clear and readable syntax; if you already know how to program, you can learn it in very little time.

You need less code to do the same thing compared to other programming languages.
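For example, counting word frequencies, a task that takes noticeably more code in many languages, is only a few lines of Python (the sample text is arbitrary):

```python
from collections import Counter

text = "to be or not to be"
counts = Counter(text.split())  # split on whitespace, then tally each word
print(counts.most_common(2))    # -> [('to', 2), ('be', 2)]
```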

It’s a great starter language because of the huge global community and wealth of documentation.

It is useful for many situations, including data science, AI and machine learning, web development, and IoT devices like the Raspberry Pi.

Large organizations like IBM, Wikipedia, Google, Yahoo!, CERN, NASA, Facebook, Amazon, Instagram, Spotify, and Reddit use Python heavily.

It is a powerful general-purpose programming language that can do a lot of things and can be applied to many different classes of problems.

It has a large standard library which provides support for many different tasks, such as automation, web scraping, text processing, image processing, machine learning, and data analytics.

For data science, Python has scientific computing libraries like Pandas, NumPy, SciPy, and Matplotlib.

For artificial intelligence, there are TensorFlow, PyTorch, Keras, and Scikit-learn.

Most interestingly, it can also be used for Natural Language Processing (NLP) using the Natural Language Toolkit (NLTK).

The Python language has a code of conduct, enforced by the Python Software Foundation, that seeks to ensure safety and inclusion for all, both online and in in-person Python communities.

There are communities like PyLadies which support many interested people.

PyLadies is an international mentorship group with a focus on helping more women become active participants and leaders in the Python open-source community.


What is PMMC Instrument?


There are different types of electrical machines which we use in our daily life. Whenever these machines are not working properly, we have to check what the problem is; for this we normally need to measure some parameters, and so we need instruments to measure them. The PMMC (permanent magnet moving coil) instrument is one of those instruments.

  • PMMC stands for "permanent magnet moving coil".
  • It is a simple and frequently used instrument.
  • It is used where exact measurement is required, and also as an aid while maintaining electrical equipment.
  • It is also known as a D'Arsonval galvanometer, because it is a kind of galvanometer that works on the D'Arsonval principle.
  • These instruments use a permanent magnet to create a stationary magnetic field around the coil.
  • The moving coil, connected to the electrical source, generates a deflecting torque according to Fleming's left-hand rule.

Working of PMMC Instrument

  • The working principle of the PMMC instrument is simple: whenever current flows in the moving coil placed in the stationary magnetic field of the permanent magnets, a force acts on the coil, because a magnetic field exerts a force on a current-carrying conductor (Fleming's left-hand rule). A deflecting torque is thus produced, and the springs provide a controlling torque that holds the pointer in equilibrium at the desired reading.

Construction of PMMC Instrument

The important parts of PMMC are as follows;

Moving Coil

  • It is an essential component of the PMMC instrument. The coil is made by winding copper wire on a rectangular former placed between the magnetic poles.
  • The rectangular former is made of aluminium and is pivoted on jewelled bearings.
  • This permits the coil to turn freely.
  • When current is supplied through the coil, it deflects within the stationary magnetic field, and from the deflection we measure the voltage or current magnitude.

Note: A non-metallic former is used for current measurement, while a metallic (aluminium) former, which provides high electromagnetic damping, is used for voltage measurement.

Magnet System

  • It includes two high-intensity magnets, or a 'U'-shaped magnet-based design.
  • These magnets are made of Alnico or Alcomax for superior field intensity and coercive force.
  • In several designs, an extra soft-iron cylinder is placed between the magnetic poles to make the field uniform and to decrease the air reluctance, increasing the strength of the field.

Control

  • The deflection of the pointer is controlled by the control springs, which provide an equal and opposite torque to balance the pointer in the equilibrium position.
  • The springs are fabricated from phosphor bronze and are arranged between the two jewel bearings.
  • The springs also provide the path for the current to flow into and out of the moving coil.
  • In ribbon-suspension designs, the controlling torque comes mainly from the twist of the ribbon.

Damping Torque

  • Damping torque is generated by the movement of the aluminium former within the magnetic field.
  • Because of the coil's movement in the magnetic field, eddy currents are generated in the aluminium former; these produce a damping force, or torque, that resists the motion of the coil.
  • So the oscillation of the pointer dies out gradually and the pointer settles at a steady position, which helps in taking correct measurements.

Pointer and Scale

  • In this instrument, the pointer is connected to the moving coil.
  • So the movement of the coil results in the movement of the pointer, which gives the reading on the scale.
  • The pointer is made of a light material so that it can move easily with the coil.
  • Sometimes there is a chance of parallax error, which is reduced by using a light, blade-shaped pointer properly aligned over the scale.


Sources of Error in PMMC

  • Temperature effects.
  • Ageing of the instrument causes errors in its main parts, such as the magnet, the moving coil, and the springs.
  • Errors can be reduced by connecting a swamping resistance in series with the moving coil; a swamping resistance is simply a resistance with a low temperature coefficient, which reduces temperature effects on the instrument.

Torque Equation

The equation involved in the PMMC instruments is the torque equation.

                                              Td = NBLdI

where,

'N' is the number of turns in the coil

'B' is the flux density in the air gap

'L' and 'd' are the vertical and horizontal dimensions of the coil

'I' is the current flowing in the coil

                                       G = NBLd

The restoring torque provided to the moving coil by the spring is

                                        Tc = Kθ (‘K’ is the spring constant)

The final deflection is obtained from the equation Tc = Td

Substituting the values of Tc and Td in the above equation, we get

                                        Kθ = NBLdI

we know G = NBLd, so

Kθ = GI

θ = GI/K

I = (K/G)θ

So we conclude that in a PMMC instrument the deflection is directly proportional to the current flowing in the coil.
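The result θ = GI/K is easy to evaluate numerically. A small Python sketch; every parameter value below (turns, flux density, coil dimensions, spring constant) is illustrative, not taken from a real meter:

```python
def pmmc_deflection(current, n_turns, flux_density, length, depth, k_spring):
    """Deflection angle of a PMMC movement from Kθ = N·B·L·d·I,
    i.e. θ = G·I / K with G = N·B·L·d."""
    g = n_turns * flux_density * length * depth  # G = NBLd
    return g * current / k_spring                # θ = GI/K

# 100 turns, B = 0.2 T, coil 3 cm x 2 cm, K = 2e-5 N·m/rad, I = 1 mA
theta = pmmc_deflection(1e-3, 100, 0.2, 0.03, 0.02, 2e-5)
print(theta)  # deflection in radians, proportional to the current
```

Doubling the current doubles θ, which is exactly the proportionality stated above.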



Performance Characteristics of Digital meters


  • Resolution.
  • Accuracy 
  • Linearity - for an equal change in input, the output changes in the same proportion.
  • Settling time
  • Temperature Sensitivity
  • Monotonicity 

 Note: Most digital meters in use (around 99%) are digital voltmeters.


Advantages and Disadvantages of Electronic Instruments


Advantages of Electronic Instruments

  • Low power Consumption
  • High-Frequency Range
  • Better Resolution
  • Storage facility
  • Accuracy is high

Disadvantages of Electronic Instruments.

  • Sensitivity
  • The effect of noise is more dominant than in analog instruments. (Unwanted signals are called noise.)
  • Electronic instruments/devices are prone to damage due to the loading effect.
  • Digital instruments may lose reliability.



How to Focus the Mind?


Many people ask how to focus when they want to concentrate on a certain task and are unable to: many things are going on in their mind, so they are not able to think properly and stay focused on the work they are doing.

The major reason people are unable to focus on a certain task is that they have many tasks to do and cannot decide which one to do first, or they want to do one task but another task must be done first.

The simple solution to this problem is to be selective. Don't overload yourself: choose only those tasks which are necessary for you, reduce their number to a minimum you can handle easily, and try to finish them on time. Whenever you start a task, note the time and assign one hour more than the required time. Be calm, take a long breath, and start finishing it. Do a single task at a time and remove all distractions that can divert your mind, because our mind can do a single task at a time effectively; it is not suited for multitasking. Try to do things with a cool mindset, because scientists say our brain performs better when we are calm.


Introduction to LASER


The term LASER is short for Light Amplification by Stimulated Emission of Radiation; the name signifies that light is amplified by stimulated (forced) emission of radiation. The laser is one of the most important discoveries of the century.

T. Maiman was the first to operate a laser successfully, in 1960 in the USA, using a ruby crystal. After this, the first gas laser was built by Ali Javan and coworkers. From then on, different types of lasers came forward: lasers using solids, liquids, and gases have been developed. The immense use of lasers, from toys to warfare and from welding to surgery, has made them very popular.

Laser light is basically an electromagnetic wave, but it has some special characteristics which make it different from ordinary light:

Directionality: The laser beam is highly directional, having almost no divergence (except for the diffraction effect).

Note: For a typical laser beam the divergence is less than 0.01 milliradian, i.e., for a metre of propagation the spread is less than 0.01 mm.
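The figures in the note follow from the small-angle relation spread ≈ divergence × distance. A quick check in Python (0.01 mrad is the note's example value):

```python
def beam_spread(divergence_rad, distance_m):
    """Small-angle beam spread: extra width ≈ divergence (rad) * distance (m)."""
    return divergence_rad * distance_m

spread_m = beam_spread(0.01e-3, 1.0)  # 0.01 mrad over 1 m of propagation
print(spread_m * 1000)                # spread in mm, about 0.01
```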

Monochromaticity: Laser light is nearly monochromatic in nature. In reality, no light is perfectly monochromatic, meaning it cannot be described by a single wavelength; instead it is characterized by a spread in frequency. For laser light this spread is much smaller than for ordinary light, so compared with ordinary light we can say that laser light is monochromatic.

Monochromaticity is also taken as a measure of spectral purity: the smaller the spread, the higher the purity of the spectrum.

Coherence: Laser radiation is characterized by a high degree of coherence, both spatial and temporal. In other words, a constant phase relationship exists in the laser's radiation field between different locations and different times.

It is possible to observe interference effects from two independent laser beams. This is the main feature that distinguishes laser light from ordinary light; the other characteristics are related to this high degree of coherence.

In general, the coherence or phase between two light waves can vary from point to point (in space) or change from instant to instant (in time). Thus, there are two types of coherence:

Temporal coherence: the source maintains coherence with respect to time.

Spatial coherence: the two waves maintain a constant phase relationship over time at different points in space.



Intensity: The laser is highly intense compared with an ordinary source of light. Since the power is concentrated in the small diameter of the laser beam, even a small laser can deliver a very high intensity at the focal plane of a lens.








How to deal with JEE Physics?


I feel from my experience that many JEE aspirants find physics difficult: some have difficulty understanding certain concepts, and some have problems solving questions in the exam. So today I want to discuss how to think while reading and solving physics questions.

Whenever you are going to study physics, first be ready and calm; make your mind blank, meaning don't think about other things, and then start thinking in steps toward the answer you want. First, try to understand the problem and what you have to do. It sounds obvious to know what is asked, but I feel many students start solving a question without properly understanding it, and this results in wrong answers. So first understand the problem, then think about which concept relates to the question and how to apply it; find a starting point and then move forward. At each step you have to think more than you write, because the whole JEE exam checks your mental ability, so you have to think in a structured way. Many students think less and do more, so in reality they don't know what they are doing; they feel they are making progress, but when it gets tough they leave the question. So once you have understood the problem and know how and which concept to apply, solve the problem with a cool mind, step by step, and do everything after thinking; don't think after doing.

So the conclusion in one line is 

You have to think before doing.


You can ask your doubts in the comment section.



Common mistakes every JEE Mains aspirant makes


Today we discuss some common mistakes which I think every JEE Mains aspirant makes during his or her preparation. (Note: I write everything from my own experience of JEE Mains.)

1. Almost every JEE Mains aspirant thinks about first completing the syllabus fast, leaving practice and revision for afterwards, and so they rush through the syllabus without revision or question practice. Some finish it, and some spend approximately 90% of their time on it; in the end they do not have enough time to revise and practice questions, and they are still unable to solve previous-year questions. Even those who finish the syllabus in time mostly don't like to revise everything, because the syllabus is huge; even thinking about it feels like a big burden, so they leave some topics, and the problem starts there.


2. Another similar mistake aspirants make is going into topics in more depth than JEE requires. Going deep into concepts is a good practice, but when you have limited time it amounts to wasting time, or poor utilization of it. So try to cover only the required topics to the required depth. Be selective.

3. Some JEE aspirants are okay with time management, their determination is excellent, and they give more than the required time to their preparation, but they don't get satisfactory results. In my view, their mistake is that they give the time and read all the material, but they don't grasp it; they just pass over the concepts, so they forget them and are unable to link them to other concepts. So whenever you read a topic, try to understand it completely, try to use it, and also look for its applications in real life; this helps you remember things.

4. The golden rule of preparation is revision, but most JEE aspirants do not revise concepts, or are unable to because of the large syllabus. The mistake they make is thinking they will revise the full syllabus at exam time, or all of it together; I think that is not possible if you are not revising regularly. Try to make revision a habit of your preparation. I think revision is even more important than practice, because practicing questions is done to clear and apply the concepts; if you remember your concepts, that happens automatically. It's my personal experience.

5. The other point I want to add is: try to read NCERT, because these books are very important. Many of us read them in school, so it's easy to revise the syllabus from NCERT, and from my experience approximately 80% or more of the concepts are from NCERT.



6. After completing the syllabus (approximately 60-70%) and revising NCERT, the most important thing is to take tests, analyze them, and make a good strategy for attempting the exam, so you score the best you can depending on the paper level and type of questions.

The best strategy can give you better results than you expect, even in your worst condition. So make a good strategy for the exam; you can refer to mine at My strategy of jee mains.

Note: Previous years' papers are a combined solution for building concepts, practice, and revision, so don't forget them; try to solve them from the start of your preparation as real 3-hour tests.

If you have any doubts then you can ask in the comment section. 


Strategy to attempt the JEE Mains exam


We all know that knowledge alone does not decide the JEE Mains result; the strategy of attempting the paper and time management affect the result a lot, and a good strategy can boost your score. So let's begin with the strategy, which is my own strategy for attempting the JEE Mains exam; it gave me a much better result than my usual performance.



First 30 minutes of the exam

In the JEE Mains exam the easiest section is chemistry, if you know a little about it. At the start, give approximately 30 minutes or less to chemistry and read all the one- or two-line questions; most probably, if the paper is of medium difficulty, you will be able to attempt at least 70-80% of the chemistry section in the first 30 minutes. (After reading and solving the one- or two-line questions, you can go for the bigger ones if you have time left in the first 30 minutes.)

Next 30 minutes of the exam

For the next 30 minutes, go to physics and read the one- or two-line questions first. In these 30 minutes your target is to attempt 10 questions from the physics section; if you finish those 10 early, attempt more, but try to attempt at least 10. Don't forget to read the smaller questions first (don't go for a bigger one unless it seems easy, and only after reading all the smaller ones in the physics section).

Next 1 hour 

In this hour, go to the maths section and attempt, in order, all the questions you know how to solve. Leave for later any question you have doubts about or that seems difficult, and attempt it after you have gone through the whole maths section. Try to finish all of maths in one hour.

The last hour: first 30 minutes

In this time, go to the physics section again and attempt the remaining physics questions. You know where each question is and which topic it is from, so solve the questions from the topics on which you have more knowledge, your strong topics. Once you have finished all the questions you know, go to the last 30 minutes.

The last 30 minutes

Go to the chemistry section and attempt all the remaining questions; it does not take much time, because you attempted most of them before. After attempting them, review the whole paper and go to the questions whose answers you doubt, or the questions that are left; basically, examine your paper, and you are finished with the exam.


Benefits

1. You get to all the easy questions in the whole paper.

2. You utilize your time effectively throughout the exam.

3. When attempting the same section for a long time, the brain slows down and we feel sleepy; by switching between sections, your brain remains active throughout the exam.

4. You get the best output from your exam if you follow the strategy strictly (some changes are allowed according to the paper, but not large ones).

5. There are many more benefits; you can think of them.


Advice

Try to attempt at least 10 previous-year question papers with this strategy and analyze them. In the beginning you will feel trouble, but after 3-4 papers you will be able to see the result.


Programming language for data science


The one-line answer to the question is: it depends on the work you want to do. Each language has its own strengths and weaknesses, and there is no single answer; it depends largely on your needs, the problem you are solving, and whom you are solving it for. But the most common languages in the field of data science, as per my knowledge, and also the best ones to start with, are Python, R, and SQL.



Python is very famous because it is easy, you can do a large number of things with a little code, its syntax is easy to read, and much more. R's array-oriented syntax makes it easy to translate from math to code; that language is used especially by statisticians, mathematicians, data analysts, etc. And SQL is a language for dealing with structured data effectively.
But there are also some more languages with their own strengths and weaknesses; the most popular of them are Scala, Java, C++, and Julia. JavaScript, PHP, Go, Ruby, and Visual Basic all have their own use cases.
The language you use also depends on the company you are working in and the project you are assigned.

Why is Big Data such a famous topic these days?


These days, Big Data is a very famous topic of discussion in the technical field.

In the past we had data, but we did not have sufficient technologies to deal with it, so the field was not so famous. At present we have technologies like Hadoop, and we also have high computation power, which is the major reason the world is looking toward this field.



At present we are generating a huge amount of data from various sources: mobile phones, the internet, street cameras, etc. Many big companies realized the importance of data years ago, so they started storing it. As a result, we now have a huge amount of data already stored, plus data still coming in from different sources; this huge amount of data is known as Big Data.

This huge amount of data is analyzed; hidden features and trends are identified and used for the development of business, games, society, etc. For example, one company analyzed recorded videos of several basketball matches and found the places on the court from which the chances of scoring are very high; then they improved their team's performance on this basis and scored very well in later games.

So we can see from the above example that the analysis of data can make a very great difference.



Hence this is the main reason why Big Data is such a famous topic these days: the big companies working in this field realized the importance of data, or Big Data, and are working to extract useful insights from it, so with time everyone feels the importance of data, and hence it's a very famous topic.

Also, skilled people are needed for these jobs, and the demand for them is high, so this is also a very important point that has made Big Data a trending topic.





What is Hadoop?

 

Traditionally, to process data you bring the data to the computer: you have a program for the processing, you feed the data to the program, and the program does the processing. For big data clusters, Larry Page and Sergey Brin came up with a different idea: slice the data into small pieces and distribute the pieces among thousands of computers (at first it was hundreds, then thousands, and now tens of thousands). Then the program is sent to all the computers in the cluster; each computer processes its slice of the data and sends its result back, the results are combined, and the data processing is done in very little time.



The first step is known as the map, or mapper, process, and the second is known as the reduce process. It is a fairly simple pattern, but it turns out to be very useful for processing large amounts of data: with twice the number of servers you get twice the throughput. This removed the bottleneck for all the major social media companies.
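The map and reduce steps described above can be sketched in a few lines of Python with word counting, the classic MapReduce example; here each list entry stands in for the slice of data sent to one machine:

```python
from collections import Counter
from functools import reduce

# Each string stands in for the slice of data sent to one machine.
chunks = ["big data big", "data cluster", "big cluster cluster"]

# Map step: every "machine" counts the words in its own slice.
mapped = [Counter(chunk.split()) for chunk in chunks]

# Reduce step: the partial counts are merged into a single result.
total = reduce(lambda a, b: a + b, mapped)
print(total["big"], total["data"], total["cluster"])  # -> 3 2 3
```

In a real Hadoop cluster the map step runs in parallel on the machines holding the data slices, and only the small partial counts travel over the network to the reducers.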

Yahoo then got on board. Yahoo hired Doug Cutting, who had been working on a clone, or copy, of the Google big data architecture, and that is now called Hadoop. Hadoop has become very famous: there are hundreds of thousands of companies out there that have some kind of footprint in the big data world.









Qualities of Data Scientist


The important qualities of a data scientist are curiosity, being extremely argumentative, and being judgmental.

The most important one is curiosity, because if you are not curious you would not know what to do with the data. Judgmental, because if you have no preconceived notions about things, you would not know where to begin and where to go. Argumentative, because if you have this skill you can argue about your results and modify them; then you learn from the data, which leads you to a better result.



The other quality a data scientist needs is comfort and flexibility with analytics platforms: some software, some computing platform; but that's secondary. The most important things are curiosity and the ability to take positions. Once you have done that, once you have analyzed, then you have some answers.



The last quality of a data scientist is the ability to tell a story. Once you have your analytics and your tabulations, you should be able to tell a great story from them, because everything is worthless if you are not able to explain your findings: they will remain hidden, remain buried, and nobody will know. Your position in this field largely depends on the ability to tell stories.

The starting point for acquiring the qualities or skills of a data scientist is to decide which field you are interested in and in which field you want to be a data scientist. For example, say you are gaining skills in the IT field, but you want to be a data scientist in the health field; a data scientist in the health field needs a different set of skills. So first decide this, along with what your competitive advantage is.



Your competitive advantage is not necessarily going to be your analytical skills; it is your understanding of a field in which you can stand far apart from the crowd, maybe film, music, computers, art, etc. Once you have figured this out, you can start acquiring your analytical skills: decide which platforms and tools to learn, and learn them; they will be specific to the industry you are interested in. And once you have some proficiency in the tools, the next step is to apply your skills to real problems, and then tell the rest of the world what you can do with them.



The Report Structure


Q. What should the report contain if it is five to six pages or less?

Ans: In this case, the report is more to the point and presents a summary of the key findings.

Q. What does a long or detailed report contain?

Ans: A detailed report contains arguments and details about relevant work, research methodology, data sources, and intermediate findings along with the main results.



Important constituents of a report.

Even if a report is short, say four to five pages, it should contain a cover page, table of contents, executive summary, detailed contents, acknowledgments, references, and appendices (if needed).

Explanation of some important constituents of the report.

1. Cover page: This page contains the title of the report, the names of the authors with their affiliations and contacts, the name of the institutional publisher (if any), and the date of publication.

2. Table of contents (ToC): This page lists the important topics of the whole report. It gives a quick overview of what the report contains, which helps a lot when the report is long.

3.Abstract or an executive summary: Nothing is more powerful than explaining the crux of your arguments in three paragraphs or less. Of course, for large documents running a few hundred pages, the executive summary could be longer.

4. Introductory section: The introduction helps readers who are new to the topic become familiar with the subject.

5. Methodology: This section introduces the research methods and data sources you used for the analysis. If you collected new data, explain the data collection exercise in detail.

6. Results: In this section you present your empirical findings. Starting with descriptive statistics and illustrative graphics, you move towards formally testing your hypotheses.

7. Discussion section: Here you craft your main arguments by building on the results presented earlier, relying on your narrative to let the numbers communicate your thesis to your readers. You refer the reader back to the research question and the knowledge gaps identified earlier, and you highlight how your findings provide the missing piece of the puzzle.

8. Conclusions: Here you state the conclusions drawn from the report and the future directions of your work.

9. References: Here you list the references for your report.

10. Acknowledgments: Acknowledging the support of those who enabled your work is always good.

11. Appendices: Included if needed.




What is Data Mining? and steps in Data Mining

 What is Data Mining?

Basically, data mining is the whole process of processing data and extracting the required information from it, using techniques such as machine learning and statistics.

The process consists of several steps, which are as follows:



1. Establishing Data Mining Goals

The first step in data mining is to set up goals for the exercise. We have to identify the key questions we want answered; the costs and benefits of the exercise are also factors to consider. Furthermore, we have to determine in advance the expected level of accuracy and usefulness of the results obtained from data mining. If money were no object, you could throw as many funds as necessary at the problem to get the desired answers; however, the cost-benefit trade-off is always a major factor in determining the goals of data mining. The desired level of accuracy affects the cost, and vice versa; moreover, beyond a certain point, additional accuracy does not improve the result much. The cost-benefit trade-offs for the desired level of accuracy are therefore an important consideration when establishing data mining goals.


2. Selecting Data

The output of a data mining exercise largely depends upon the quality of the data being used. At times data is readily available for processing; for instance, retailers often possess large databases of customer purchases and demographics. On the other hand, data may not be readily available, in which case it must be collected through activities such as surveys, which adds to the cost of the exercise. The type of data, its size, and the frequency of collection have a direct bearing on the cost of the data mining exercise. Identifying the type of data that can answer the key questions at a reasonable cost is therefore critical.


3. Preprocessing Data

Preprocessing data is an important part of data mining. In this step we examine the data for errors and irrelevant attributes; sometimes even relevant records have missing information. In the preprocessing stage you remove irrelevant data and flag erroneous data as necessary. Human errors may also be present, so the data should be subject to checks to ensure its integrity. Lastly, we have to develop a formal method of dealing with missing data and determine whether the data are missing randomly or systematically.

If the data were missing randomly, a simple set of solutions would suffice. However, if the data are missing systematically, we have to identify how that affects the results.
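As a minimal sketch of the missing-at-random case, the snippet below fills missing values with the mean of the observed ones. The income figures and the mean-imputation rule are illustrative assumptions, suitable only when values are missing at random:

```python
# Minimal sketch of handling missing values in preprocessing.
# Mean imputation is a simple fix appropriate for randomly missing data;
# systematically missing data needs a more careful treatment.

def impute_random_missing(values):
    """Replace None entries with the mean of the observed values."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

incomes = [30000, None, 52000, 41000, None]
print(impute_random_missing(incomes))  # [30000, 41000.0, 52000, 41000, 41000.0]
```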


4. Transforming Data

After the relevant attributes of the data are retained, the next step is to determine the appropriate format in which to store them. Here we focus on reducing the number of attributes needed to explain the phenomena. This may require transforming the data: data reduction algorithms, such as Principal Component Analysis, can reduce the number of attributes without losing a significant amount of information. For example, we can group income from all sources of a family into a single aggregate family income attribute.

Often you need to transform variables from one type to another. It may be prudent to transform the continuous variable for income into a categorical variable, where each record in the database is identified as a low-, medium-, or high-income individual. This can help capture the non-linearities in the underlying behavior.
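Such a continuous-to-categorical transformation can be sketched as a simple binning function; the cut-off points (25,000 and 75,000) are illustrative assumptions, not part of the original text:

```python
# Sketch: binning a continuous income variable into categories.
# The cut-off values are illustrative assumptions.

def income_band(income, low_cutoff=25000, high_cutoff=75000):
    """Map a continuous income value to 'low', 'medium', or 'high'."""
    if income < low_cutoff:
        return "low"
    if income < high_cutoff:
        return "medium"
    return "high"

print([income_band(x) for x in (18000, 40000, 90000)])  # ['low', 'medium', 'high']
```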


5. Storing Data

The transformed data must be stored in a format conducive to data mining. The storage should give immediate and unrestricted read and write access to the data scientists, so that new variables can be written easily and the search algorithms of the data mining process do not have to perform unnecessary searches across different servers. Privacy, security, and safety are also important concerns at the storage step.


6. Mining Data

The data mining step covers the data analysis methods, including parametric and non-parametric methods and machine-learning algorithms. A good starting point for data mining is data visualization. Multidimensional views of the data, using the advanced graphing capabilities of data mining software, are very helpful in developing a preliminary understanding of the trends hidden in the data set.
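Even before visualization, a quick numerical summary already hints at the trends. A minimal sketch using only the standard library, with made-up sales figures:

```python
# Sketch: simple descriptive statistics as a first look at a data set.
# The daily sales figures are made up for illustration.
import statistics

daily_sales = [120, 135, 128, 150, 145, 160, 158]

print("mean:  ", statistics.mean(daily_sales))             # central tendency
print("median:", statistics.median(daily_sales))           # robust centre
print("stdev: ", round(statistics.stdev(daily_sales), 2))  # spread
```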


7. Evaluating Mining Results

After results have been extracted from data mining, we perform a formal evaluation of them. A formal evaluation could include testing the predictive capabilities of the models on observed data to see how effective and efficient the algorithms have been in reproducing the data. This is known as an in-sample forecast. The results are then shared with the key stakeholders for feedback, and later iterations of data mining follow to improve the process.

Data mining and evaluating the results thus become an iterative process, in which the analysts use better and improved algorithms to raise the quality of the results in light of the feedback received from the key stakeholders.
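The idea of an in-sample forecast can be sketched as fitting a model and scoring it on the very data it was fitted to; here a least-squares line, with all numbers purely illustrative:

```python
# Sketch of an in-sample forecast: fit a least-squares line to the
# observed data, then measure how well it reproduces that same data.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

xs = [1, 2, 3, 4]
ys = [2.1, 3.9, 6.0, 8.1]
slope, intercept = fit_line(xs, ys)

# Sum of squared errors on the data the model was fitted to:
in_sample_sse = sum((y - (slope * x + intercept)) ** 2
                    for x, y in zip(xs, ys))
print(round(in_sample_sse, 3))  # small => the model reproduces the data well
```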

-- Reference: IBM Data Science Professional course at Coursera


What is Big Data? and its v's

 

What is Big Data?


What is Big Data? It is a very popular question in 2021. So what is it? Every day a vast amount of data is produced by us, coming from various sources: mobile phones, YouTube, Google searches, social media such as Facebook and Instagram, sensor information, security cameras, sports data, and so on. All this data is so large that it is very difficult to handle with traditional techniques. There is even a name for it: Big Data. Ernst and Young define big data as:








"Big Data refers to the dynamic, large and disparate volumes of data being created by people, tools, and machines. It requires new, innovative, and scalable technology to collect, host, and analytically process the vast amount of data gathered in order to derive real-time business insights that relate to consumers, risk, profit, performance, productivity management, and enhanced shareholder value."

What are the 5 Vs of Big Data?

There is no fixed definition of big data, but in all cases certain elements are common: velocity, volume, variety, veracity, and value. These are known as the Vs of Big Data.


What is the Velocity of Big Data?: Velocity is the speed at which data accumulates. Data is generated at great speed every second, all over the world and in many forms, in a process that never ends. Every action on the internet or in front of a camera generates data; consider how large the world's population is and how many people now use digital devices, each playing a role in data generation.



What is the Volume of Big Data?: Volume is the scale at which data is generated, or the growth in the data stored. It is one of the most important Vs: volume is so closely tied to big data that big data is sometimes defined on the basis of volume alone. Today we can store huge amounts of data on small chips, and it is this storage capability that makes big data possible.
  

What is the Variety of Big Data?: Variety is the diversity of the data: structured data, such as the rows and columns of tables in relational databases, and unstructured data, which is not organized in a systematic way, such as emails, social media posts, YouTube videos, blogs, and business documents. Variety also reflects data from different sources: mobile phones, machines, video, processes, security cameras, and so on.



What is the Veracity of Big Data?: Veracity is the quality and origin of data and its conformity to facts and accuracy. Its attributes include consistency, completeness, integrity, and ambiguity. Today we have huge amounts of data, so it is natural to ask whether the data is true and accurate or not; but because the data is so large, verifying it is a struggle, and methods of dealing with this problem are improved from time to time.



What is the Value of Big Data?: Value is the ability, and the need, to turn data into something valuable. Value is not just profit or loss; it can take other forms, such as medical or social value that benefits people. Most of the time data scientists deal with data in order to extract value from it; if there were no value or benefit in the data, we would not be dealing with it. The most important point, then, is that we deal with data because we want to extract value from it.




What is Data Science?


Let's first discuss: what is data? We can say that data is any type of information, whether stored or not, and the study of this data is known as data science. That is the one-line answer to the question, What is Data Science?

What is Data Science? Data science is basically the study of data. The term is also defined by others in many ways, for example in terms of the size of the data or the skills required of a data scientist, but fundamentally data science is the study of data in order to extract meaningful information from it.

When we deal with data we encounter many types, which may be structured (such as tables) or unstructured (emails, social media, etc.). Dealing with this data efficiently requires different kinds of skills; this is where statistics, programming skills, and mathematics come in.

At present a vast amount of data arrives every second from different sources: emails, social media, log files, patent information, sports data, sensor information, security cameras, and so on. This huge amount of data leads to the term big data.

 Present Scenario of Data Science

At present data science is a very popular term because we have huge amounts of data, along with huge computing power and efficient algorithms to deal with it, but most companies do not know how to use their data effectively, and they pay well for people who understand data science. The essential skills for a data scientist are curiosity, being extremely argumentative and judgmental, and proficiency in some tools for dealing with data effectively.


Who can become a data scientist?


Many people have doubts about who can become a data scientist and what qualifications, skills, and traits are needed. It is clear that data science is not a field like medicine or engineering; it is not something people dream of in childhood, thinking "I want to become a data scientist" or "I want to do something in the field of data science."



Five or six years ago the name data science barely existed and hardly anyone knew about it. It is an emerging field, and most of the people working in it moved into it gradually, realizing over time that they were interested in it and wanted to do further work in it. Many come from different backgrounds such as statistics, the arts, or the sciences; some are musicians or people who initially worked in entirely different fields and later realized that they wanted to make their career in this one.



So from the above discussion we can say that no fixed field or qualification is required to become a data scientist. The only thing that matters is your interest in the field; you should also first understand what the field is and why it is so famous now.

Further, if you are interested in this field and want to work in it, certain qualities are required of a data scientist. You can read about them here: Qualities needed of a data scientist.


Wien's Bridge

 Wien's Bridge

It is primarily known as a frequency-determining bridge. It is described here not only for its use in AC bridges but also because it appears in various other useful circuits.

The Wien's Bridge may also be employed in a harmonic distortion analyzer, where it is used as a notch filter, discriminating against one specific frequency.

It also has applications in audio and HF oscillators as the frequency-determining device.



From the figure, at balance we have:

                 ( R1/( 1 + jwC1R1 ) ).R4 = ( R2 - j/(wC2) ).R3

Solving and equating the real and imaginary parts, we get

                 R4/R3 = R2/R1 + C1/C2

and              wC1R2 - 1/(wC2R1) = 0,  from which  w = 1/( R1.R2.C1.C2 )^(1/2)

and frequency    f = 1/( 2.pi.( R1.R2.C1.C2 )^(1/2) )


In most Wien bridges, the components are so chosen that

                 R1 = R2 = R          and    C1 = C2 = C

The equations then reduce to:  R4/R3 = 2

and              f = 1/( 2.pi.R.C )
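As a quick numerical check of the equal-component balance frequency f = 1/(2.pi.R.C), here is a small sketch; the component values are illustrative:

```python
# Numerical check of the Wien bridge balance frequency for equal
# components, f = 1/(2*pi*R*C). Component values are illustrative.
import math

def wien_frequency(R, C):
    """Balance frequency with R1 = R2 = R and C1 = C2 = C."""
    return 1.0 / (2.0 * math.pi * R * C)

R = 10_000      # 10 kilo-ohms
C = 0.01e-6     # 0.01 microfarads
print(round(wien_frequency(R, C), 1))  # about 1591.5 Hz
```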

Switch resistors R1 and R2 are mechanically linked so as to fulfill the condition R1 = R2.

As long as C1 and C2 are fixed capacitors and R4 = 2R3, the Wien bridge may be used as a frequency-determining device balanced by a single control. This control may be calibrated directly in terms of frequency.

This bridge is suitable for the measurement of frequency in the range of 100 Hz to 100 kHz, with an accuracy of 0.1 to 0.5 percent.

Because of its frequency sensitivity, the Wien bridge is difficult to balance unless the waveform is perfectly sinusoidal.

The bridge is not balanced for any harmonics present in the applied voltage, so these harmonics will sometimes produce an output voltage that masks the true balance point. This difficulty can be overcome by connecting a filter in series with the null detector.

Wien's bridge may be used for the measurement of capacitance also.


Campbell's Bridge

 Campbell's Bridge

This bridge measures an unknown mutual inductance in terms of a standard mutual inductance.

Let   M1 = unknown mutual inductance,

         L1 = self-inductance of secondary of mutual inductance M1,

         M2 = variable standard mutual inductance,

          L2 = self-inductance of secondary of mutual inductance M2,

and   R1,R2,R3,R4 = non - inductive resistances.



There are two steps in balancing the process.

1.    The detector is connected between b and d and the balance point is obtained; the requirement for balance is:

                            L1/L2 = R1/R2 = R3/R4

This bridge may be balanced by adjustment of R3 ( or R4) and R1 ( or R2).

2.    The detector is then connected between b' and d'. Keeping the adjustments of step 1, the variable standard mutual inductance M2 is varied to obtain the balance point. Then

                            M1/M2 = R3/R4

                            M1 = M2.( R3/R4 )
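The final relation M1 = M2.(R3/R4) can be evaluated directly; the component values below are illustrative:

```python
# Sketch: unknown mutual inductance from the Campbell's bridge
# balance relation M1 = M2 * (R3 / R4). The values are illustrative.

def unknown_mutual_inductance(M2, R3, R4):
    """Return M1 in the same unit as M2."""
    return M2 * (R3 / R4)

M2 = 5e-3              # 5 mH standard mutual inductance
R3, R4 = 200.0, 100.0  # non-inductive ratio arms
print(unknown_mutual_inductance(M2, R3, R4))  # 0.01 H, i.e. 10 mH
```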
