Artificial Intelligence Archives - Fresh Gravity
https://www.freshgravity.com/insights-blogs/category/artificial-intelligence/

Microsoft Fabric – A Unified View of the Modern Data Landscape
https://www.freshgravity.com/insights-blogs/microsoft-fabric-a-unified-view-of-the-modern-data-landscape/
Mon, 25 Nov 2024

Written By Siddharth Mohanty, Sr. Manager, Data Management

Stepping into The Future With AI 

The future is AI.  

From easy-to-use copilot experiences to custom generative AI solutions, every organization today is exploring how they can best utilize AI.

However, as businesses get ready for an AI-powered future, they will also require clean data to power it. Fostering game-changing AI innovation takes a well-orchestrated data estate that can support everything from specialized AI initiatives to scalable AI solutions spanning the whole organization. This is a challenging prospect for most organizations, whose data environments have grown organically over time with specialized and fragmented solutions. A complex data estate leads to data sprawl and duplication, infrastructure inefficiencies, limited interoperability, and data exposure risks.  

Data leaders who wish to help businesses streamline and advance their data estate must evaluate thousands of data and AI offerings, select the best services, figure out how to integrate them, and do all of this in a way that is flexible and scalable enough to change as the business grows.  

Microsoft Fabric eliminates the need to spend time integrating specialist solutions and managing a complex data estate by introducing a unified, end-to-end stack of analytics and data platform services.  

Below is how we at Fresh Gravity envision the transition from a fragmented technology stack to a unified platform with Microsoft Fabric. 

Microsoft Fabric – Key Features 

Microsoft Fabric is an end-to-end, unified analytics platform that brings together all the data and analytics tools organizations need. Secure and governed by default, Fabric provides a unified Software as a Service (SaaS) experience, a unified billing model, and a lake-centric, open, AI-powered foundation for data analytics. Listed below are the capabilities implemented by Microsoft Fabric.

 

Microsoft Fabric Capabilities

Microsoft Fabric Implementation Use Cases 

With its unified architecture, Microsoft Fabric can support a wide range of Data Management and Data Science use cases. Listed below are some key implementation use cases – 

Select Use Cases 

Fresh Gravity POV for a Data Platform Implementation 

As part of exploring MS Fabric's capabilities, we at Fresh Gravity recently designed and built an in-house mini data platform that ingests data files from various sources, lands them in a landing zone on Fabric, processes and transforms the data using the Medallion Data Lakehouse architecture, and finally serves the data for consumption via Power BI. 

Key Features of our mini-data platform: 

  • Sets up workspaces under the Power BI/Fabric license 
  • Sets up the OneLake Lakehouse 
  • Builds Data Factory copy pipelines to read data from Azure Blob Storage, Snowflake, and SQL Server and land it in a transient landing zone on OneLake 
  • Uses PySpark notebooks to load data from the landing zone into the Bronze tables on the OneLake Lakehouse 
  • Uses PySpark to perform the transformations, cleansing, and standardization needed to load the Silver tables; at Silver, the notebooks apply canonical data models, normalization, SCD Type 1, SCD Type 2, etc. 
  • Uses the Gold tables as data mart tables with domain aggregates for reporting purposes 
  • Adds flexibility with a separate, built-in data flow for ad hoc analysis of raw files landing on the OneLake Lakehouse, which can be queried visually for reporting in Power BI 
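To make the Silver-layer logic above concrete, here is a minimal sketch of an SCD Type 2 merge in plain Python. This is an illustration only: the platform itself uses PySpark notebooks, and the customer dimension, column names, and dates below are hypothetical.

```python
from datetime import date

def scd2_merge(silver, incoming, key, tracked, today):
    """Apply SCD Type 2: expire changed rows, append new versions.

    silver   -- list of dicts carrying 'valid_from', 'valid_to', 'is_current'
    incoming -- list of dicts with the business key + tracked attributes
    Assumes each business key appears at most once in `incoming`.
    """
    current = {r[key]: r for r in silver if r["is_current"]}
    for row in incoming:
        existing = current.get(row[key])
        if existing is None:
            # Brand-new key: insert as the current version
            silver.append({**row, "valid_from": today,
                           "valid_to": None, "is_current": True})
        elif any(existing[c] != row[c] for c in tracked):
            # Tracked attribute changed: expire old version, append new one
            existing["valid_to"] = today
            existing["is_current"] = False
            silver.append({**row, "valid_from": today,
                           "valid_to": None, "is_current": True})
        # Unchanged rows are left untouched
    return silver

# Hypothetical customer dimension in the Silver layer
silver = [{"cust_id": 1, "city": "Pune",
           "valid_from": date(2024, 1, 1), "valid_to": None, "is_current": True}]
incoming = [{"cust_id": 1, "city": "Mumbai"},   # changed -> new version
            {"cust_id": 2, "city": "Delhi"}]    # new key -> inserted
silver = scd2_merge(silver, incoming, "cust_id", ["city"], date(2024, 6, 1))
```

After the merge, the original Pune row is expired (with its `valid_to` set) and two current rows exist, preserving full change history for downstream Gold aggregates.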

Below is the architecture diagram of the mini-data platform built on MS Fabric –  

Architecture diagram of the mini-data platform built on MS Fabric

As newer Microsoft Fabric services reach general availability (GA), Fresh Gravity is working proactively to stay ahead by building and deploying real-life data projects on MS Fabric. Stay tuned as we continue to share blogs and thought leadership content on other aspects of Microsoft Fabric.  

To learn more about our data project implementations, best practices, and regulatory-compliant solution designs using industry-standard tools and services, please write to us at info@freshgravity.com.

The post Microsoft Fabric – A Unified View of the Modern Data Landscape appeared first on Fresh Gravity.

A Deep Dive into the Realm of AI, ML, and DL
https://www.freshgravity.com/insights-blogs/a-deep-dive-into-ai-ml-and-dl/
Wed, 03 Jan 2024

Written By Debayan Ghosh, Sr. Manager, Data Management

In today’s fast-paced world, where information travels at the speed of light and decisions are made in the blink of an eye, a silent revolution is taking place. Picture this: You’re navigating through the labyrinth of online shopping, and before you even type a single letter into the search bar, a collection of products appears, perfectly tailored to your taste. You’re on a video call with a friend, and suddenly, in real time, your spoken words transform into written text on the screen with eerie accuracy. Have you ever wondered how your favorite social media platform knows exactly what content will keep you scrolling for hours? 

Welcome to the era of Artificial Intelligence (AI), where the invisible hand of technology is reshaping the way we live, work, and interact with the world around us. As we stand at the crossroads of innovation and discovery, the profound impact of AI is becoming increasingly undeniable. 

In this blog, we embark on a journey to unravel the mysteries of Artificial Intelligence (AI), Machine Learning (ML), and Deep Learning (DL), fields that not only keep pace with the present but set the rhythm for the future. 

Demystifying the trio – AI, ML, and DL 

The terms Artificial Intelligence (AI), Machine Learning (ML), and Deep Learning (DL) are often used interchangeably, but they are not the same. 

At a very high level, DL is a subset of ML, which in turn is a subset of AI. 

AI is any program that can sense, reason, act, and adapt; essentially, a machine exhibiting any form of intelligent behavior.  

ML is a subset of AI in which the machine can replicate intelligent behavior and continues to learn as it is exposed to more data.  

And finally, DL is a subset of Machine Learning. It, too, improves as it is exposed to more data, but the term refers specifically to algorithms built on multi-layered neural networks.  

Deep Dive into ML 

Machine Learning is the study and construction of programs that are not explicitly programmed by humans, but rather learn patterns as they’re exposed to more data over time.  

For instance, if we’re trying to decide whether emails are spam or not, we will start with a dataset containing a bunch of emails labeled spam versus not spam. These emails are preprocessed and fed through a Machine Learning algorithm that learns the patterns for spam versus not spam, and the more emails it goes through, the better the model gets. Once the algorithm is trained, we can use the model to predict spam versus not spam. 
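As a minimal, from-scratch sketch of this idea, the toy Naive Bayes classifier below learns word patterns from a handful of labeled emails and scores new messages. A real system would use a library such as Scikit-Learn and far more training data; the emails and labels here are invented for illustration.

```python
import math
from collections import Counter

def train(emails):
    """emails: list of (text, label) pairs; learn per-class word counts."""
    counts = {"spam": Counter(), "ham": Counter()}
    totals = Counter()
    for text, label in emails:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def predict(text, counts, totals):
    """Pick the class with the higher log posterior (add-one smoothing).

    Assumes exactly two classes, "spam" and "ham".
    """
    vocab = set(counts["spam"]) | set(counts["ham"])
    best, best_score = None, -math.inf
    for label in ("spam", "ham"):
        score = math.log(totals[label] / sum(totals.values()))    # prior
        denom = sum(counts[label].values()) + len(vocab)
        for word in text.lower().split():
            score += math.log((counts[label][word] + 1) / denom)  # likelihood
        if score > best_score:
            best, best_score = label, score
    return best

emails = [("win a free prize now", "spam"),
          ("free money click now", "spam"),
          ("meeting agenda for monday", "ham"),
          ("lunch plans this week", "ham")]
counts, totals = train(emails)
print(predict("claim your free prize", counts, totals))  # prints "spam"
```

The more labeled emails the classifier sees, the sharper its word statistics become, which is exactly the "better with more data" behavior described above.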

Types of ML 

In general, there are two main types of Machine Learning: Supervised Learning and Unsupervised Learning.  

In supervised learning, we have a target column (labels); in unsupervised learning, we do not.  

The goal of supervised learning is to predict that label. An example of supervised learning is fraud detection. We can define our features to be transaction time, transaction amount, transaction location, and category of purchase. Combining these features, we should be able to predict, for a given transaction, whether there is unusual activity and whether the transaction is fraudulent.  

In unsupervised learning, the goal is to find an underlying structure of the dataset without any labels. An example would be customer segmentation for a marketing campaign. For this, we may have e-commerce data and we would want to separate the customers into groups to target them accordingly. In unsupervised learning, there’s no right or wrong answer.  
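A common approach to segmentation like this is k-means clustering. The sketch below is a bare-bones, plain-Python version on hypothetical customer data; a real campaign would use a library implementation, many more features, and careful initialization (the deterministic init here is only for reproducibility of the sketch).

```python
def kmeans(points, k, iters=10):
    """Minimal k-means: assign points to the nearest centroid, recompute means."""
    step = max(1, len(points) // k)
    centroids = points[::step][:k]  # spread-out deterministic init for the sketch
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to the centroid with the smallest squared distance
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[i])))
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster (keep old if empty)
        centroids = [tuple(sum(dim) / len(c) for dim in zip(*c)) if c
                     else centroids[i] for i, c in enumerate(clusters)]
    return centroids, clusters

# Hypothetical customers: (annual spend in $k, store visits per month)
customers = [(1, 2), (9, 10), (2, 1), (10, 9), (1.5, 1.5), (9.5, 9.5)]
centroids, clusters = kmeans(customers, k=2)
```

With no labels, the algorithm simply discovers that the data splits into a low-spend and a high-spend group, which marketing could then target differently.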

Machine Learning Workflow  

The machine learning workflow consists of:  

  • Problem statement 
  • Data collection 
  • Data exploration and preprocessing 
  • Modeling and fine-tuning 
  • Validation 
  • Decision Making and Deployment 

So, our first step is the problem statement. What problem are we trying to solve? For example, we may want to identify different breeds of dogs; this can be done with image recognition.  

The second step is data collection. What data do we need to solve the problem? For example, to classify different dog breeds, we would need not just a single picture of each breed but many pictures in different lighting and from different angles, all correctly labeled.  

The next step is data exploration and preprocessing. This is when we clean our data as much as possible so that our model can predict accurately. It includes a deep dive into the data, a look at distribution counts, and heat maps of the densest pixel regions. After that, we reach the next step, modeling, which means building a model to solve our problem. We start with some basic baseline models that we then validate: did the model solve the problem? We validate by holding out a set of pictures the model was not trained on and seeing how well it classifies those images, given the labels that we have.  

Then comes decision-making and deployment. If the model achieves an acceptable range of accuracy, we move it forward to higher environments (Staging and Production) after communicating with the required stakeholders. 
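The workflow above can be sketched end to end with a toy model: collect data, split off a holdout set, train a baseline, validate, and gate deployment on accuracy. Everything here is illustrative; the nearest-centroid "model", toy features, and 0.8 threshold are stand-ins for a real pipeline.

```python
def train_centroids(rows):
    """Baseline 'model': the mean feature vector (centroid) of each label."""
    groups = {}
    for features, label in rows:
        groups.setdefault(label, []).append(features)
    return {label: tuple(sum(dim) / len(feats) for dim in zip(*feats))
            for label, feats in groups.items()}

def predict(model, features):
    """Classify by nearest centroid (squared Euclidean distance)."""
    return min(model, key=lambda lbl: sum((a - b) ** 2
                                          for a, b in zip(features, model[lbl])))

# Toy labeled data: (feature vector, label)
data = [((1.0, 1.1), "a"), ((5.0, 5.2), "b"),
        ((0.9, 1.0), "a"), ((4.8, 5.1), "b"),
        ((1.2, 0.8), "a"), ((5.3, 4.9), "b")]
train, holdout = data[:4], data[4:]       # data collection and split
model = train_centroids(train)            # modeling
accuracy = sum(predict(model, f) == lbl   # validation on unseen examples
               for f, lbl in holdout) / len(holdout)
deploy = accuracy >= 0.8                  # decision gate before deployment
```

The key discipline the sketch encodes is that validation always happens on examples the model never trained on, and the deployment decision is made against an agreed accuracy threshold.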

Deep Dive into Deep Learning (DL) 

Defining features in an image is a much more difficult task and has long been a limitation of traditional Machine Learning techniques. Deep Learning, however, addresses this well. 

So, suppose we want to determine if an image is a cat or a dog; what features should we use? For images, the data is numerical, representing the coloring of each pixel in the image, so each pixel could be used as a feature. However, even a small image of 256 by 256 pixels comes out to 65,536 pixels, which means 65,536 features, a huge number of features to be working with.  

Another issue is that using each pixel as an individual feature means losing the spatial relationship to the pixels around it. In other words, the information in a pixel makes sense relative to its surrounding pixels. For instance, different pixels make up the nose and different pixels make up the eyes, and separating them according to where they are on the face is quite a challenging task. This is where Deep Learning comes into the picture: Deep Learning techniques allow the model to learn features on its own, combining pixels to capture these spatial relationships. 
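One way deep networks capture these spatial relationships is with convolutions, which score small neighborhoods of pixels instead of treating each pixel independently. Here is a hedged, from-scratch sketch; the 4x4 "image" and the tiny edge-detecting kernel are invented for illustration, and real networks learn their kernels rather than hand-coding them.

```python
def conv2d(image, kernel):
    """Slide a small kernel over the image; each output value summarizes a
    neighborhood of pixels, preserving spatial structure that flattening
    the image into 65,536 independent features would destroy."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(len(image) - kh + 1):
        row = []
        for c in range(len(image[0]) - kw + 1):
            row.append(sum(image[r + i][c + j] * kernel[i][j]
                           for i in range(kh) for j in range(kw)))
        out.append(row)
    return out

# A 4x4 "image" with a vertical edge down the middle
image = [[0, 0, 9, 9]] * 4
edge_kernel = [[-1, 1]]  # responds where the pixel to the right is brighter
response = conv2d(image, edge_kernel)  # strong response only at the edge
```

The response is large only where neighboring pixel values change, which is exactly the kind of local, spatial pattern (edges, then noses and eyes at deeper layers) that pixel-by-pixel features cannot express.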

Deep Learning is Machine Learning that uses very complicated models called deep neural networks. It is cutting-edge, and most Machine Learning research is focused there at present. It has shown exceptional performance compared to other algorithms when dealing with large datasets.  

However, it is important to note that with smaller datasets, standard Machine Learning algorithms often perform significantly better than Deep Learning algorithms. Also, if the data changes a lot over time and there isn’t a steady dataset, standard Machine Learning will probably perform better over time.  

Types of Libraries used for AI models: 

We can use the following Python libraries:  

  • NumPy for numerical analysis 
  • Pandas for reading data into DataFrames 
  • Matplotlib and Seaborn for visualization 
  • Scikit-Learn for machine learning  
  • TensorFlow and Keras specifically for deep learning 

How is AI creating an impact for us today? Is this era of AI different? 

The two spaces where we see drastic growth and innovation today are computer vision and natural language processing. 

The sharp advancements in computer vision are impacting multiple areas. Some of the most notable advancements are in the automobile industry, where cars can drive themselves. In healthcare, computer vision is now used to review different imaging modalities, such as X-rays and MRIs, to diagnose illnesses. We’re fast approaching the point where machines are doing as well as, if not better than, medical experts.  

Similarly, natural language processing is booming, with vast improvements in its ability to convert speech into text, determine sentiment, cluster news articles, write papers, and much more.  

Factors that have contributed to the current state of Machine Learning are:  

  • Bigger data sets 
  • Faster computers  
  • Open-source packages 
  • Wide range of neural network architectures

We now have larger and more diverse datasets than ever before. With Cloud infrastructure now in place to store copious amounts of data much more cheaply, and with access to powerful hardware for processing and storing data, we have larger, finer datasets from which to learn underlying patterns across a multitude of fields. All this is leading to cutting-edge results in a variety of fields.  

For instance, our phones can recognize our faces and our voices, and they can look at pictures and identify us and our friends. We have stores, such as Amazon Go, where we can walk in, pick things up, and leave without going to a checkout counter. Our homes are powered by our voices, telling smart machines to play music or switch the lights on and off.  

All of this has been driven by the current era of artificial intelligence. AI is now used to aid in medical imaging. For drug discovery, a great example is Pfizer, which uses IBM Watson’s machine learning to power its search for immuno-oncology drugs. Patient care is being driven by AI. AI research within the healthcare industry has helped advance sensory aids for the deaf, the blind, and those who have lost limbs. 

How can Fresh Gravity help? 

Fresh Gravity has rich experience and expertise in Artificial Intelligence. Our AI offerings include Machine Learning, Deep Learning Solutions, Natural Language Processing (NLP) Services, Generative AI Solutions, and more. To learn more about how we can help elevate your data journey through AI, please write to us at info@freshgravity.com or you can directly reach out to me at debayan.Ghosh@freshgravity.com. 

Please follow us at Fresh Gravity for more insightful blogs. 

 

The post A Deep Dive into the Realm of AI, ML, and DL appeared first on Fresh Gravity.

Exploring the AI Frontier in Data Management for Data Professionals
https://www.freshgravity.com/insights-blogs/for-data-professionals/
Mon, 09 Oct 2023

Written By Sudarsana Roy Choudhury, Managing Director, Data Management

The beginnings of AI are shrouded in myths, stories, and rumors of artificial beings endowed with intelligence or consciousness by master craftsmen. One of the first formal beginnings of AI research was a workshop held on the campus of Dartmouth College, USA, during the summer of 1956. This was followed by an AI winter around 1974, when the US government withdrew funding for AI research. That changed when the Japanese government showcased major progress and heavily funded the field. The boom we see today started in the first decade of the 21st century, and of course we are now at a point where AI impacts all areas of our lives and jobs. 

AI has been a hot topic for many decades, but its relevance for data professionals is greater today than it has ever been. Recently, I had the opportunity to moderate a panel discussion hosted by Fresh Gravity on “Exploring the AI Frontier in Data Management for Data Professionals”. The panelists for the discussion, a group of talented data professionals with vast knowledge and in-depth experience, were Ayush Mittal (Manager, Data Science & Analytics – Fresh Gravity), Siddharth Mohanty (Sr. Manager, Data Management – Fresh Gravity), Soumen Chakraborty (Director, Data Management – Fresh Gravity), and Vaibhav Sathe (Director, Data Management – Fresh Gravity). 

It was an opportunity to share thoughts and pointers on what we, as data professionals, should gear up on to leverage the opportunities that AI-driven tools provide and are expected to provide: enhancing the value proposition we offer our clients, helping us work smarter, and letting us spend time and effort on the right areas instead of laboring over activities that can be done quicker and better with AI offerings. 

To summarize, I would like to list some key takeaways from this insightful session – 

  • We are all experiencing the impact of AI in our everyday lives. The ability to understand its nuances and utilize the options it offers (like personalized product suggestions on retail websites) can make our lives simpler, without allowing AI to control us
  • Cybersecurity is a key concern for all of us. AI can analyze complex patterns of malicious activity and quickly detect and respond to security threats
  • Optimizing the use of AI can be a huge differentiator in delivering solutions for clients; some of the tools that can be leveraged are Stack Overflow, Copilots, Google, and AI-driven data modeling tools
  • For Data Management, with the huge volume and variety of data an organization has to deal with, the shift has already started. By using more AI-driven tools and services, organizations can ensure quicker insights, transformations, and movement of data. This trend will only accelerate going forward
  • AI will have a direct impact on improving end-user outcomes through speed of delivery and quality of data insights and predictions. What we see now is just the beginning of a huge paradigm shift in the way value is delivered to the end user
  • Establishing ethical usage of data and implementing Data Governance around data usage is key to AI success
  • Not everyone needs to understand the code behind AI algorithms, but everyone should understand their core purpose and operational methodology. Truly harnessing the power of any AI system hinges on a blend of foundational knowledge and intuitive reasoning, ensuring its effective and optimal use
  • Some upskilling and curiosity to learn are essential in each role (Business Analysts, Quality Assurance Engineers, Data Analysts, etc.) to take advantage of the AI-driven tools that are flooding the market and will continue to evolve
  • While some of us may dream of getting back to an AI-less world, others are embracing the new AI-enabled world with glee! The reality is that AI is here to stay, and the way we approach and adapt to this revolution will determine whether we can benefit while staying within ethical limits

Fresh Gravity has rich experience and expertise in Artificial Intelligence. Our AI offerings include Deep Learning Solutions, Natural Language Processing (NLP) Services, Generative AI Solutions, and more. To learn more about how we can help elevate your data journey through AI, please write to us at info@freshgravity.com or you can directly reach out to me at Sudarsana.roychoudhury@freshgravity.com. 

The post Exploring the AI Frontier in Data Management for Data Professionals appeared first on Fresh Gravity.
