Big Data To Avoid Weather Related Flight Delays



This paper identifies key aviation data sets for operational analytics, presents a methodology for applying big-data analysis methods to operational problems, and offers examples of analytical solutions using an integrated aviation data warehouse. Big-data analysis methods have revolutionized how both government and commercial researchers analyze massive aviation databases that were previously too cumbersome, inconsistent, or irregular to yield high-quality output. Traditional data-mining methods are effective on uniform data sets such as flight-tracking data or weather, but integrating heterogeneous data sets introduces complexity in data standardization, normalization, and scalability.

The variability of the underlying data warehouse can be accommodated by virtualized cloud infrastructure, which provides the scalability needed to identify trends and create actionable information. Applications of big-data analysis to airspace-system performance and safety optimization have high potential because of the availability and diversity of airspace-related data. Analytical applications that quantitatively review airspace performance, operational efficiency, and aviation safety require a broad data set. Individual information sets such as radar tracking data or weather reports provide slices of relevant data, but on their own do not provide the context, perspective, and detail required to create actionable knowledge.
These data sets are published by diverse sources and do not have the standardization, uniformity or defect controls required for simple integration and analysis. At a minimum, aviation big-data research requires the fusion of airline, aircraft, flight, radar, crew, and weather data in a uniform taxonomy, organized so that queries can be automated by flight, by fleet, or across the airspace system.
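As a concrete illustration of such fusion, the sketch below joins flight records to weather observations under a shared key of airport code and departure hour. All record formats, field names, and values here are invented for illustration; real feeds (radar tracks, METAR reports, crew rosters) arrive in far less consistent schemas.

```python
from datetime import datetime

# Hypothetical, simplified records: in practice these come from separate,
# non-standardized feeds and must first be normalized to a common schema.
flights = [
    {"flight": "UA101", "origin": "ORD", "dep": datetime(2013, 6, 1, 14, 5)},
    {"flight": "DL202", "origin": "ATL", "dep": datetime(2013, 6, 1, 15, 40)},
]
weather = {
    ("ORD", datetime(2013, 6, 1, 14)): {"wind_kt": 25, "visibility_mi": 2.0},
    ("ATL", datetime(2013, 6, 1, 15)): {"wind_kt": 8, "visibility_mi": 10.0},
}

def fuse(flights, weather):
    """Join each flight to the weather observed at its origin airport,
    keyed by airport code and departure hour (a uniform taxonomy)."""
    fused = []
    for f in flights:
        key = (f["origin"], f["dep"].replace(minute=0))
        fused.append({**f, **weather.get(key, {})})
    return fused

for row in fuse(flights, weather):
    print(row["flight"], row.get("wind_kt"))
```

Once every record carries the same keys, queries by flight, by fleet, or across the whole system reduce to grouping on different columns of the fused table.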


Recent years have witnessed a dramatic increase in our ability to collect data from various sensors and devices, in different formats, from independent or connected applications. This data flood has outpaced our capability to process, analyze, store, and understand these datasets. Consider Internet data: the web pages indexed by Google numbered around one million in 1998, quickly reached 1 billion in 2000, and had already exceeded 1 trillion by 2008. This rapid expansion is accelerated by the dramatic rise of social-networking applications such as Facebook, Twitter, and Weibo, which allow users to create content freely and amplify the already huge volume of the Web.


Furthermore, with mobile phones becoming the sensory gateway to real-time data on people from many aspects of life, the vast amount of data that mobile carriers could potentially process to improve our daily lives has significantly outgrown the past call-data-record (CDR) processing used for billing purposes only. It can be foreseen that Internet of Things (IoT) applications will raise the scale of data to an unprecedented level. People and devices (from home coffee machines to cars, buses, railway stations, and airports) are all loosely connected. Trillions of such connected components will generate a huge data ocean, and valuable information must be discovered from that data to help improve quality of life and make our world a better place. For example, after we get up every morning, in order to optimize our commute to work, a system needs to process information ranging from traffic, weather, construction, and police activity to our calendar schedules, and perform deep optimization under tight time constraints before we set out.
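The commute example above can be sketched as a toy optimization. All feeds and numbers below are invented placeholders for what real traffic, weather, and calendar services would supply:

```python
# Hypothetical per-departure-hour estimates fused from several feeds
# (traffic, weather, construction, police activity) -- all values invented.
travel_min = {7: 55, 8: 70, 9: 40}    # baseline drive time by departure hour
weather_delay = {7: 5, 8: 15, 9: 0}   # extra minutes from rain that hour
meeting_hour = 10                     # first appointment, from the calendar feed

def best_departure(travel_min, weather_delay, meeting_hour):
    """Pick the departure hour with the shortest door-to-door time
    that still arrives before the meeting."""
    feasible = []
    for hour, base in travel_min.items():
        total = base + weather_delay.get(hour, 0)
        arrival = hour + total / 60.0
        if arrival <= meeting_hour:
            feasible.append((total, hour))
    return min(feasible)[1] if feasible else None

print(best_departure(travel_min, weather_delay, meeting_hour))  # → 9
```

The point of the sketch is the fusion step: no single feed (traffic alone, weather alone) yields the answer; the optimization only becomes possible once the sources are combined.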


In all these applications, we face significant challenges in leveraging the vast amount of data, including challenges in (1) system capabilities, (2) algorithmic design, and (3) business models.


Students from the University of Michigan have started new research that helps in understanding the weather of a particular place. They have taken weather data from the past ten years, and analyzing this data helps reveal patterns in the weather. Identifying similarities with past years could help predict future weather, which is very useful for aviation: with this information, flights can be warned of bad weather in advance.
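One simple way such historical pattern matching could work is a nearest-neighbour lookup over archived conditions. This is only an illustrative sketch with invented data, not the Michigan group's actual method:

```python
import math

# Hypothetical ten-year archive, reduced to a few rows: each entry is
# (wind_kt, visibility_mi, delayed), where `delayed` records whether
# departures that day suffered weather delays. All values are invented.
history = [
    (30, 1.0, True),
    (28, 2.0, True),
    (5, 10.0, False),
    (8, 9.0, False),
]

def delay_risk(wind_kt, visibility_mi, k=3):
    """Estimate delay probability for today's conditions as the fraction of
    delayed days among the k most similar historical days."""
    def dist(row):
        # Weight visibility more heavily so both features matter comparably.
        return math.hypot(row[0] - wind_kt, (row[1] - visibility_mi) * 3)
    nearest = sorted(history, key=dist)[:k]
    return sum(r[2] for r in nearest) / k

print(delay_risk(27, 1.5))
```

With a real ten-year archive the same idea scales: today's conditions are matched against the most similar historical days, and the observed outcomes on those days become the delay forecast.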


• Understanding and Targeting Customers

• Understanding and Optimizing Business Processes

• Personal Quantification and Performance Optimization

• Improving Healthcare and Public Health

• Improving Sports Performance

• Improving Science and Research

• Optimizing Machine and Device Performance

• Improving Security and Law Enforcement




It can be concluded that big-data computing can be applied to the weather-forecasting process. Data mining over big data can compute accurate future weather, and such a system increases the accuracy, reliability, and consistency of weather identification and interpretation. The backpropagation algorithm can also be applied to weather-forecasting data, since neural networks are capable of modeling a weather-forecast system. Overall, this helps airlines avoid flight delays and cancellations.
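As an illustration of the backpropagation approach mentioned above, here is a minimal one-hidden-layer network trained by backpropagation on invented, normalized weather features. It is a sketch of the technique, not a production forecasting model:

```python
import math
import random

random.seed(0)

# Toy training set (invented numbers): [wind, visibility, 1.0] -> delay flag;
# the trailing 1.0 acts as a bias input.
data = [([0.9, 0.1, 1.0], 1.0), ([0.8, 0.2, 1.0], 1.0),
        ([0.1, 0.9, 1.0], 0.0), ([0.2, 0.8, 1.0], 0.0)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

H = 3  # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(H)]
w2 = [random.uniform(-1, 1) for _ in range(H + 1)]  # +1 for hidden-layer bias

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w1] + [1.0]
    return h, sigmoid(sum(w * hi for w, hi in zip(w2, h)))

lr = 0.5
for _ in range(3000):
    for x, y in data:
        h, out = forward(x)
        # Backpropagation: output delta, then hidden deltas, then updates.
        d_out = (out - y) * out * (1 - out)
        d_h = [d_out * w2[j] * h[j] * (1 - h[j]) for j in range(H)]
        for j in range(H + 1):
            w2[j] -= lr * d_out * h[j]
        for j in range(H):
            for i in range(3):
                w1[j][i] -= lr * d_h[j] * x[i]

# High wind + low visibility should now score as likely delayed.
print(forward([0.85, 0.15, 1.0])[1] > 0.5)
```

On real data the same loop would run over many years of labeled observations with more input features, but the gradient steps shown here are exactly what "applying backpropagation to forecasting data" means.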



© 2013 All Rights Reserved.