Big Data: Data Science Applications and Present Scenario
International Journal of Engineering Trends and Technology (IJETT)
© 2019 by IJETT Journal
Year of Publication: 2019
Authors: Shubhankar Chaturvedi and Shwetank Kanava
DOI: 10.14445/22315381/IJETT-V67I1P210
MLA Style: Shubhankar Chaturvedi and Shwetank Kanava. "Big Data: Data Science Applications and Present Scenario." International Journal of Engineering Trends and Technology 67.1 (2019): 57-59.
APA Style: Shubhankar Chaturvedi and Shwetank Kanava (2019). Big Data: Data Science Applications and Present Scenario. International Journal of Engineering Trends and Technology, 67(1), 57-59.
In this paper, we present a brief study of data science, a subject discussed frequently in the scientific community. We also survey recent trends and techniques and their impact on both the scientific and social communities.