Call for Abstracts
The World Congress on Computer Science, Machine Learning and Big Data will be organized around the theme “An Innovative Outlook About The Emerging Trends In Computer Science”.
Computer Science Meet 2018 comprises keynote and speaker sessions on the latest cutting-edge research, designed to offer comprehensive global discussions that address current issues in computer science.
Submit your abstract to any of the mentioned tracks.
Register now for the conference by choosing the package that suits you.
Computer science forms the technological infrastructure of modern commerce. Computer technology is an ever-evolving, expanding field: it is the driving force of every industry and permeates everyday life. The ability to combine the power of computing with the management of multimedia information is arguably the key to gaining ascendancy in any field.
- Track 1-1 Scientific computing
- Track 1-2 Computer graphics
- Track 1-3 Algorithmic trading
- Track 1-4 Simulation
- Track 1-5 Human-Computer Interaction
Machine learning is a type of artificial intelligence (AI) that allows software applications to become more accurate at predicting outcomes without being explicitly programmed. The basic idea of machine learning is to build algorithms that can receive input data and use statistical analysis to predict an output value within a satisfactory range.
- Track 2-1 Machine learning algorithms
- Track 2-2 Supervised learning
- Track 2-3 Unsupervised learning
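The predict-from-examples idea behind supervised learning can be sketched in a few lines. The toy least-squares fit below stands in for what production libraries such as scikit-learn do at scale; the function name and data are illustrative, not from any particular library:

```python
# A minimal sketch of supervised learning: fit a 1-D linear model
# y ≈ w*x + b to labelled examples, then predict an unseen input.
def fit_linear(xs, ys):
    """Ordinary least squares for a single feature."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    w = cov / var
    b = mean_y - w * mean_x
    return w, b

# Training data generated by the hidden rule y = 2x + 1.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]
w, b = fit_linear(xs, ys)
print(w, b)          # learned weight and bias: 2.0 1.0
print(w * 10 + b)    # prediction for the unseen input x = 10: 21.0
```

The model is "trained" only on the five labelled pairs, yet it generalizes to inputs it has never seen, which is the essence of the predictive accuracy described above.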
Deep learning combines advances in computing power with special types of neural networks to learn complex patterns in huge amounts of data. Deep learning techniques are currently the state of the art for identifying objects in images and words in sounds. Researchers now look forward to applying these successes in pattern recognition to more complex tasks such as automatic language translation, medical diagnosis, and numerous other important social and business problems.
- Track 3-1 How to build neural networks
- Track 3-2 Convolutional networks
- Track 3-3 RNNs, LSTM, Adam, Dropout, BatchNorm
- Track 3-4 Xavier/He initialization
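Two of the track topics above, building a network and Xavier initialization, fit in a short sketch. This is a bare-bones forward pass in plain Python, assuming tanh activations (the setting Xavier initialization was designed for); real work would use a framework such as PyTorch or TensorFlow:

```python
import math
import random

def xavier_init(fan_in, fan_out, rng):
    """Xavier/Glorot initialization: draw weights with variance
    2 / (fan_in + fan_out), which keeps activation scale roughly
    stable from layer to layer."""
    std = math.sqrt(2.0 / (fan_in + fan_out))
    return [[rng.gauss(0.0, std) for _ in range(fan_out)]
            for _ in range(fan_in)]

def forward(x, w):
    """One fully connected layer followed by a tanh activation."""
    return [math.tanh(sum(xi * w[i][j] for i, xi in enumerate(x)))
            for j in range(len(w[0]))]

rng = random.Random(0)
w1 = xavier_init(4, 8, rng)   # hidden layer: 4 inputs -> 8 units
w2 = xavier_init(8, 2, rng)   # output layer: 8 units -> 2 outputs
h = forward([0.5, -0.2, 0.1, 0.9], w1)
y = forward(h, w2)
print(len(h), len(y))         # layer sizes: 8 2
```

Stacking more `forward` calls is all "deep" means structurally; training (backpropagation, Adam, Dropout) is the part the sketch omits.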
AI, or artificial intelligence, is the simulation of human intelligence processes by machines, especially computer systems. These processes include learning (the acquisition of information and rules for using the information), reasoning (using the rules to reach approximate or definite conclusions), and self-correction. Applications of AI include expert systems, speech recognition and machine vision. Today, it is an umbrella term that encompasses everything from robotic process automation to actual robotics. It has gained prominence recently due, in part, to big data, or the increase in the speed, size and variety of data businesses are now collecting. AI can perform tasks such as identifying patterns in data more efficiently than humans, enabling businesses to gain more insight from their data.
- Track 4-1 Robotic process automation
- Track 4-2 Machine vision
- Track 4-3 Natural language processing
- Track 4-4 Robotics
A.I. is being used today by businesses both big and small. How much of an effect will this technology have on our future lives, and in what other ways will it seep into day-to-day life? When A.I. truly blossoms, how much of an improvement will it be over its current iterations?
- Track 5-1 AI in healthcare
- Track 5-2 AI in business
- Track 5-3 AI in education
- Track 5-4 AI in finance
- Track 5-5 AI in manufacturing
Big data is a term that describes the large volume of data – both structured and unstructured – that inundates a business on a day-to-day basis. But it’s not the amount of data that’s important. It’s what organizations do with the data that matters. Big data can be analyzed for insights that lead to better decisions and strategic business moves. The amount of data that’s being created and stored on a global level is almost inconceivable, and it just keeps growing. That means there’s even more potential to glean key insights from business information – yet only a small percentage of data is analyzed. What does that mean for businesses? How can they make better use of the raw information that flows into their organizations every day?
- Track 6-1 Streaming data
- Track 6-2 Social media data
- Track 6-3 Publicly available sources
- Track 6-4 Data exploration & visualization
- Track 6-5 Importance of Big data
- Track 6-6 Applications of Big data
Artificial Intelligence (AI), mobile, social and Internet of Things (IoT) technologies are driving data complexity and new forms and sources of data. Big data analytics is the use of advanced analytic techniques against very large, diverse data sets that include structured, semi-structured and unstructured data, from different sources, and in sizes ranging from terabytes to zettabytes. Analyzing big data allows analysts, researchers, and business users to make better and faster decisions using data that was previously inaccessible or unusable. Using advanced analytics techniques such as text analytics, machine learning, predictive analytics, data mining, statistics, and natural language processing, businesses can analyze previously untapped data sources, independently of or together with their existing enterprise data, to gain new insights resulting in better and faster decisions.
- Track 7-1 Big data Hadoop
- Track 7-2 Apache
- Track 7-3 Scala
- Track 7-4 Spark
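The simplest form of the text analytics mentioned above is term counting over unstructured documents. The sketch below does it with the standard library; the same map-and-count shape is what engines like Apache Spark distribute across a cluster (the function and data here are illustrative, not a Spark API):

```python
import re
from collections import Counter

def top_terms(docs, k=3):
    """Toy text analytics: tokenize unstructured documents and
    return the k most frequent terms with their counts."""
    counts = Counter()
    for doc in docs:
        # Lowercase and split on non-letter characters.
        counts.update(re.findall(r"[a-z']+", doc.lower()))
    return counts.most_common(k)

docs = [
    "Big data needs big tools",
    "Data mining finds patterns in data",
]
print(top_terms(docs))   # 'data' appears 3 times, 'big' twice
```

On terabyte-scale corpora the counting step is partitioned across machines and merged, but the per-document logic stays this simple.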
Data mining can be considered a superset of many different methods to extract insights from data. It might involve traditional statistical methods and machine learning. Data mining applies methods from many different areas to identify previously unknown patterns from data. This can include statistical algorithms, machine learning, text analytics, time series analysis and other areas of analytics. Data mining also includes the study and practice of data storage and data manipulation.
- Track 8-1 High performance data mining algorithms
- Track 8-2 Data mining in healthcare data
- Track 8-3 Medical data mining
- Track 8-4 Advanced database and web applications
- Track 8-5 Data mining and processing in bioinformatics, genomics and biometrics
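One classic data-mining task, finding previously unknown co-occurrence patterns, can be sketched as frequent-pair counting. This is a toy version of the support-counting step behind association-rule algorithms such as Apriori, not the full algorithm; the basket data is invented for illustration:

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_support):
    """Count how often each pair of items co-occurs across
    transactions and keep the pairs meeting a support threshold."""
    counts = Counter()
    for items in transactions:
        for pair in combinations(sorted(set(items)), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

baskets = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "jam"},
    {"milk", "butter"},
]
print(frequent_pairs(baskets, min_support=2))
# {('bread', 'milk'): 2, ('butter', 'milk'): 2}
```

The surviving pairs are the "previously unknown patterns" of the description: nothing in the input labels them, yet they emerge from counting alone.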
The cornerstone of data analytics in the cloud is cloud computing itself. Cloud computing is built around a series of hardware and software resources that can be remotely accessed through any web browser. Usually files and software are shared and worked on by multiple users, and all data is centralized remotely instead of being stored on users’ hard drives.
- Track 9-1 IoT on cloud computing
- Track 9-2 Fog computing
- Track 9-3 Cognitive computing
- Track 9-4 Mobile cloud computing
Businesses have long used data analytics to help direct their strategy to maximize profits. Ideally, data analytics helps eliminate much of the guesswork involved in trying to understand clients, instead systematically tracking data patterns to best construct business tactics and operations and minimize uncertainty. Not only does analytics determine what might attract new customers; it often recognizes existing patterns in data to help better serve existing customers, which is typically more cost effective than establishing new business. In an ever-changing business world subject to countless variants, analytics gives companies the edge in recognizing changing climates, so they can initiate appropriate action to stay competitive. Alongside analytics, cloud computing is also helping make business more effective, and the consolidation of both clouds and analytics could help businesses store, interpret, and process their big data to better meet their clients’ needs.
- Track 10-1 Software as a service (SaaS)
- Track 10-2 SaaS examples
- Track 10-3 Best uses of data analytics in the cloud
- Track 10-4 Future of data analytics in the cloud
Distributed computing is a form of Internet-based computing that provides shared processing resources and data to computers and other devices on demand. It is a model for enabling ubiquitous, on-demand access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort. Distributed computing and storage solutions provide users and enterprises with various capabilities to store and process their data in third-party data centers. It relies on sharing of resources to achieve coherence and economies of scale, much like a utility over a network.
- Track 11-1 Microsoft Azure cloud computing
- Track 11-2 Amazon Web Services
- Track 11-3 Google Cloud
- Track 11-4 Cloud automation and optimization
- Track 11-5 High Performance Computing (HPC)
- Track 11-6 Emerging cloud computing technology
Business intelligence (BI) is a technology-driven process for analyzing data and presenting actionable information to help executives, managers and other corporate end users make informed business decisions. BI encompasses a wide variety of tools, applications and methodologies that enable organizations to collect data from internal systems and external sources; prepare it for analysis; develop and run queries against that data; and create reports, dashboards and data visualizations to make the analytical results available to corporate decision-makers, as well as operational workers.
- Track 12-1 Why is BI important?
- Track 12-2 Types of BI tools
- Track 12-3 BI trends
- Track 12-4 BI for Big data
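The collect-query-report loop described above can be sketched end to end with the standard-library `sqlite3` module: load operational records, run an aggregate query, and emit the kind of summary a dashboard would display. The table and sales figures are invented for illustration:

```python
import sqlite3

# Stand-in for data collected from internal systems.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 120.0), ("west", 80.0), ("east", 40.0)],
)

# "Develop and run queries against that data": aggregate by region.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales "
    "GROUP BY region ORDER BY SUM(amount) DESC"
).fetchall()

# "Create reports": a plain-text summary a dashboard could render.
for region, total in rows:
    print(f"{region}: {total:.2f}")
# east: 160.00
# west: 80.00
```

Real BI tools wrap exactly this pipeline in visual dashboards and scheduling; the SQL aggregation step is unchanged in spirit.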
SAP is enterprise resource planning (ERP) software, while SAS is an analytics package developed by the SAS (Statistical Analysis System) Institute, founded in 1976 by James Goodnight and several colleagues from North Carolina State University. Today it is an integrated suite of software products that enables users to perform data manipulation; statistical and mathematical analysis; planning, forecasting and decision support; report writing and graphics; quality improvement; applications development; web reporting; data entry, retrieval and management; data warehousing; and data mining. SAS runs on both Windows and UNIX platforms and is used in a wide range of industries such as healthcare, education, financial services and life sciences.
- Track 13-1 SAS administrators
- Track 13-2 Customer intelligence
- Track 13-3 Data management
- Track 13-4 Risk management
- Track 13-5 Fraud & security intelligence
- Track 13-6 Data visualization
New intelligent things generally fall into three categories: robots, drones and autonomous vehicles. Each of these areas will evolve to impact a larger segment of the market and support a new phase of digital business, but they represent only one facet of intelligent things. Existing things, including Internet of Things (IoT) devices, will become intelligent things, delivering the power of AI-enabled systems everywhere: the home, the office, the factory floor and the medical facility. Observers have anticipated the forthcoming revolution of the IoT, and the resulting interconnectedness of smart home technology, for years.
The Internet of Things (IoT) is an ecosystem of connected physical objects that are accessible through the internet. The ‘thing’ in IoT could be a person with a heart monitor or an automobile with built-in sensors, i.e. objects that have been assigned an IP address and can collect and transfer data over a network without manual assistance or intervention. The embedded technology in the objects helps them interact with internal states or the external environment, which in turn affects the decisions taken. IoT – and the machine-to-machine (M2M) technology behind it – is bringing a kind of “super visibility” to nearly every industry. Imagine utilities and telcos that can predict and prevent service outages, airlines that can remotely monitor and optimize plane performance, and healthcare organizations that can base treatment on real-time genome analysis. The business possibilities are endless.
- Track 14-1 Why IoT?
- Track 14-2 What is the scope of IoT?
- Track 14-3 How can IoT help?
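The "collect and transfer data over a network without manual intervention" part of IoT boils down to small machine-to-machine payloads. The sketch below packages a sensor sample as a JSON message, like the heart-monitor example above; the field names are an invented example, not a standard telemetry schema:

```python
import json
import time

def make_reading(device_id, metric, value):
    """Package one sensor sample as a JSON message, the kind of
    small M2M payload an IoT device might publish to a broker."""
    return json.dumps({
        "device": device_id,
        "metric": metric,
        "value": value,
        "ts": int(time.time()),   # Unix timestamp of the sample
    })

msg = make_reading("hr-monitor-01", "heart_rate_bpm", 72)
decoded = json.loads(msg)
print(decoded["device"], decoded["value"])   # hr-monitor-01 72
```

In a real deployment the string would be published over a lightweight protocol such as MQTT; the message shape, a device identity plus a timestamped measurement, is the constant.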
Virtual reality (VR) and augmented reality (AR) transform the way individuals interact with each other and with software systems by creating an immersive environment. For example, VR can be used for training scenarios and remote experiences. AR, which enables a blending of the real and virtual worlds, means businesses can overlay graphics onto real-world objects, such as hidden wires on the image of a wall. Immersive experiences with AR and VR are reaching tipping points in terms of price and capability but will not replace other interface models. Over time, AR and VR will expand beyond visual immersion to include all human senses. Enterprises should look for targeted applications of VR and AR through 2020.
- Track 15-1 Computer-mediated reality
- Track 15-2 Object recognition
- Track 15-3 Virtual fixtures