Free AIF-C01 Practice Exam Questions
What are tokens in the context of generative AI models?
- A. Tokens are the basic units of input and output that a generative AI model operates on, representing words, subwords, or other linguistic units.
- B. Tokens are the mathematical representations of words or concepts used in generative AI models.
- C. Tokens are the pre-trained weights of a generative AI model that are fine-tuned for specific tasks.
- D. Tokens are the specific prompts or instructions given to a generative AI model to generate output.
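To make the tokenization concept above concrete, here is a minimal sketch; it assumes the Hugging Face transformers library is installed and uses the public gpt2 tokenizer, neither of which is part of the question itself.

```python
# Minimal tokenization sketch (assumes: pip install transformers).
# It shows a tokenizer splitting text into subword units and mapping them to
# the integer IDs a generative model actually consumes.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # public GPT-2 BPE tokenizer

text = "Tokenization splits uncommon words into subwords."
print(tokenizer.tokenize(text))   # subword strings, e.g. 'Token', 'ization', ...
print(tokenizer.encode(text))     # corresponding integer token IDs
```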
A company uses Amazon SageMaker for its ML pipeline in a production environment. The company has large input data sizes up to 1 GB and processing times up to 1 hour. The company needs near real-time latency.
Which SageMaker inference option meets these requirements?
- A. Real-time inference
- B. Serverless inference
- C. Asynchronous inference
- D. Batch transform
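For context on the options above: SageMaker asynchronous inference is the variant AWS documents for payloads up to 1 GB and processing times up to one hour, with requests queued and results delivered to Amazon S3. A minimal boto3 invocation sketch follows; the endpoint name and S3 URI are placeholders, not values from the question.

```python
# Sketch of invoking a SageMaker asynchronous inference endpoint with boto3.
# Assumptions: an async endpoint named "example-async-endpoint" already exists
# and the input payload has been uploaded to the placeholder S3 URI below.
import boto3

runtime = boto3.client("sagemaker-runtime")

response = runtime.invoke_endpoint_async(
    EndpointName="example-async-endpoint",
    InputLocation="s3://example-bucket/input/payload.json",  # payload up to 1 GB
    ContentType="application/json",
)

# The call returns immediately; the prediction is written to S3 when ready.
print(response["OutputLocation"])
```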
An AI practitioner is using a large language model (LLM) to create content for marketing campaigns. The generated content sounds plausible and factual but is incorrect.
Which problem is the LLM having?
- A. Data leakage
- B. Hallucination
- C. Overfitting
- D. Underfitting
A digital devices company wants to predict customer demand for memory hardware. The company does not have coding experience or knowledge of ML algorithms and needs to develop a data-driven predictive model. The company needs to perform analysis on internal data and external data.
Which solution will meet these requirements?
- A. Store the data in Amazon S3. Create ML models and demand forecast predictions by using Amazon SageMaker built-in algorithms that use the data from Amazon S3.
- B. Import the data into Amazon SageMaker Data Wrangler. Create ML models and demand forecast predictions by using SageMaker built-in algorithms.
- C. Import the data into Amazon SageMaker Data Wrangler. Build ML models and demand forecast predictions by using an Amazon Personalize Trending-Now recipe.
- D. Import the data into Amazon SageMaker Canvas. Build ML models and demand forecast predictions by selecting the values in the data from SageMaker Canvas.
A company has a database of petabytes of unstructured data from internal sources. The company wants to transform this data into a structured format so that its data scientists can perform machine learning (ML) tasks.
Which service will meet these requirements?
- A. Amazon Lex
- B. Amazon Rekognition
- C. Amazon Kinesis Data Streams
- D. AWS Glue
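For context on the options above: AWS Glue is AWS's serverless ETL service for exactly this raw-to-structured transformation pattern. The sketch below is a minimal Glue PySpark job; the S3 paths and formats are illustrative placeholders, not details from the question.

```python
# Minimal AWS Glue ETL job sketch (runs inside a Glue job, not locally).
# Assumptions: raw JSON records sit under the placeholder input prefix and the
# job's IAM role can read/write both S3 locations.
from awsglue.context import GlueContext
from pyspark.context import SparkContext

glue_context = GlueContext(SparkContext.getOrCreate())

# Read semi-structured JSON records into a DynamicFrame.
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-bucket/raw/"]},
    format="json",
)

# Write the records back out as columnar Parquet for downstream ML work.
glue_context.write_dynamic_frame.from_options(
    frame=raw,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/structured/"},
    format="parquet",
)
```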
A social media company wants to use a large language model (LLM) for content moderation. The company wants to evaluate the LLM outputs for bias and potential discrimination against specific groups or individuals.
Which data source should the company use to evaluate the LLM outputs with the LEAST administrative effort?
- A. User-generated content
- B. Moderation logs
- C. Content moderation guidelines
- D. Benchmark datasets
An AI practitioner trained a custom model on Amazon Bedrock by using a training dataset that contains confidential data. The AI practitioner wants to ensure that the custom model does not generate inference responses based on confidential data.
How should the AI practitioner prevent responses based on confidential data?
- A. Delete the custom model. Remove the confidential data from the training dataset. Retrain the custom model.
- B. Mask the confidential data in the inference responses by using dynamic data masking.
- C. Encrypt the confidential data in the inference responses by using Amazon SageMaker.
- D. Encrypt the confidential data in the custom model by using AWS Key Management Service (AWS KMS).
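For the delete-and-retrain remediation described in option A, a minimal boto3 sketch is below. The model names, base model ID, IAM role ARN, and S3 URIs are placeholders, and it assumes the training data at the new S3 location has already had the confidential records removed.

```python
# Sketch of removing a Bedrock custom model and retraining on cleaned data.
# All identifiers, ARNs, and S3 URIs below are placeholders.
import boto3

bedrock = boto3.client("bedrock")

# 1. Delete the custom model that was trained on confidential data.
bedrock.delete_custom_model(modelIdentifier="my-custom-model")

# 2. Start a new customization job against the cleaned training dataset.
bedrock.create_model_customization_job(
    jobName="retrain-without-confidential-data",
    customModelName="my-custom-model-v2",
    roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
    baseModelIdentifier="amazon.titan-text-express-v1",
    trainingDataConfig={"s3Uri": "s3://example-bucket/cleaned-training-data/"},
    outputDataConfig={"s3Uri": "s3://example-bucket/customization-output/"},
    hyperParameters={"epochCount": "1"},  # illustrative values only
)
```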