Practice Free H13-311_V3.5 Exam Online Questions
Which of the following statements is false about gradient descent algorithms?
- A . Each time global gradient descent updates the weights, the gradient must be computed over all training samples.
- B . When GPUs are used for parallel computing, the mini-batch gradient descent (MBGD) takes less time than the stochastic gradient descent (SGD) to complete an epoch.
- C . The global gradient descent is relatively stable, which helps the model converge to the global extremum.
- D . When there are too many samples and GPUs are not used for parallel computing, the convergence process of the global gradient algorithm is time-consuming.
B
Explanation:
The statement that mini-batch gradient descent (MBGD) takes less time than stochastic gradient descent (SGD) to complete an epoch when GPUs are used for parallel computing is incorrect.
Here’s why:
Stochastic Gradient Descent (SGD) updates the weights after each training sample, which can lead to faster updates but more noise in the gradient steps. It completes an epoch after processing all samples one by one.
Mini-batch Gradient Descent (MBGD) processes small batches of data at a time, updating the weights after each batch. While MBGD leverages the computational power of GPUs effectively for parallelization, the comparison in this question is about the time needed to complete an epoch, not about overall computation speed.
MBGD does not necessarily complete an epoch faster than SGD: because it processes multiple samples per batch, it performs fewer weight updates per epoch than SGD, which updates the weights after every individual sample.
Therefore, the correct answer is B: the statement is false, because MBGD does not always take less time than SGD to complete an epoch, even when GPUs are used for parallelization.
HCIA-AI Reference: AI Development Framework - discussion of gradient descent algorithms and their efficiency on different hardware architectures such as GPUs.
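For illustration, a minimal NumPy sketch (the dataset size, learning rate, and batch size are assumed values) contrasting how the two variants step through one epoch: SGD performs one weight update per sample, while MBGD performs one update per mini-batch.

```python
import numpy as np

# Assumed toy setup: 1,000 samples, 10 features, linear model, MSE loss.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=1000)
w = np.zeros(10)
lr = 0.01

def grad(Xb, yb, w):
    """Gradient of the MSE loss of a linear model on one batch."""
    return 2 * Xb.T @ (Xb @ w - yb) / len(yb)

# SGD: one update per sample -> 1,000 sequential updates in one epoch.
for i in range(len(X)):
    w -= lr * grad(X[i:i + 1], y[i:i + 1], w)

# MBGD: one update per mini-batch (assumed batch size 100)
# -> only 10 updates in one epoch; each batch gradient is computed
# over many samples at once, which is where GPU parallelism applies.
batch_size = 100
for start in range(0, len(X), batch_size):
    Xb, yb = X[start:start + batch_size], y[start:start + batch_size]
    w -= lr * grad(Xb, yb, w)
```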
Which of the following items are included in the results returned when the face search service is successfully called?
- A . Searched face similarity
- B . Searched face ID
- C . Searched face position
- D . Searched face number
Which basic platform services are included in Huawei Cloud EI (Enterprise Intelligence)? (Multiple Choice)
- A . Machine learning
- B . Deep learning
- C . Graph engine
- D . Batch processing
Which of the following is not a deep learning algorithm?
- A . Self-encoder (autoencoder)
- B . Convolutional neural networks
- C . Recurrent neural networks
- D . Support vector machine
Global gradient descent, stochastic gradient descent, and batch gradient descent are all gradient descent algorithms.
Which of the following statements about these algorithms is true?
- A . The batch gradient descent algorithm can solve the problem of local minima.
- B . The global gradient algorithm can find the minimum value of the loss function.
- C . The stochastic gradient algorithm can find the minimum value of the loss function.
- D . The convergence process of the global gradient algorithm is time-consuming.
D
Explanation:
The global gradient descent algorithm evaluates the gradient over the entire dataset before each update, leading to accurate but slow convergence, especially for large datasets. In contrast, stochastic gradient descent updates the model parameters more frequently, which allows for faster convergence but with noisier updates. While batch gradient descent updates the parameters based on smaller batches of data, none of these algorithms can fully guarantee finding the global minimum in non-convex problems, where local minima may exist.
Reference: Huawei HCIA-AI Certification, Machine Learning Algorithms - Gradient Descent Methods.
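For illustration, a minimal sketch on an assumed one-dimensional non-convex loss, showing why none of these variants can guarantee the global minimum: started in the basin of a local minimum, the full-gradient update converges stably, but only to that local minimum.

```python
# Assumed toy non-convex loss with a local minimum near x = -0.9
# and the global minimum near x = +1.1 (coefficients chosen for illustration).
def loss(x):
    return x**4 - 2 * x**2 - 0.5 * x

def grad(x):
    return 4 * x**3 - 4 * x - 0.5

x = -2.0   # start inside the basin of the local minimum
lr = 0.01
for _ in range(500):
    x -= lr * grad(x)   # full-gradient update: deterministic and stable

# Converges to the local minimum near -0.9, not the global minimum near +1.1.
print(round(x, 3), round(loss(x), 3))
```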
Activation functions play an important role in enabling neural network models to learn and understand very complex problems. Which of the following statements about activation functions is correct?
- A . Activation functions are linear functions
- B . Activation functions are non-linear functions
- C . The activation function is partly a nonlinear function, partly a linear function
- D . Most of the activation functions are nonlinear functions, and a few are linear functions
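For reference, a short NumPy sketch of two widely used activation functions (sigmoid and ReLU) and of why non-linearity matters: without a non-linear activation, stacked linear layers collapse into a single linear layer (the weight shapes and values are arbitrary examples).

```python
import numpy as np

# Two commonly used activation functions -- both are non-linear.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))
x = rng.normal(size=3)

# Without an activation, two linear layers compose into one linear layer.
two_linear_layers = W2 @ (W1 @ x)
one_linear_layer = (W2 @ W1) @ x
print(np.allclose(two_linear_layers, one_linear_layer))  # True

# With ReLU in between, the mapping is genuinely non-linear.
with_relu = W2 @ relu(W1 @ x)
print(with_relu)
```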
What are the Python language data types? (Multiple Choice)
- A . numbers
- B . string
- C . list
- D . tuple
- E . dictionary
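A quick illustration of each built-in type named in the options (the variable names and values are arbitrary examples).

```python
count = 42                            # number: int (float and complex also exist)
pi = 3.14159                          # number: float
name = "Huawei"                       # string
scores = [0.9, 0.8, 0.95]             # list: mutable, ordered sequence
point = (1, 2)                        # tuple: immutable, ordered sequence
config = {"lr": 0.01, "epochs": 10}   # dictionary: key-value mapping

print(type(count), type(pi), type(name), type(scores), type(point), type(config))
```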
The ModelArts service can be combined with which of the following services to easily deploy a model to the device ("end") side?
- A . OBS
- B . OCR
- C . ECS
- D . HiLens
If an L1 regularization term is added to the loss function of linear regression, the resulting model is called Lasso regression.
- A . TRUE
- B . FALSE
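For illustration, a minimal scikit-learn sketch (synthetic data and an assumed alpha value) showing Lasso as ordinary least-squares regression with an added L1 penalty, which tends to drive the coefficients of irrelevant features to exactly zero.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

# Synthetic data: only the first 2 of 10 features are informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 3 * X[:, 0] - 2 * X[:, 1] + 0.1 * rng.normal(size=200)

# Lasso objective = least-squares loss + alpha * ||w||_1 (the L1 regular term).
lasso = Lasso(alpha=0.1).fit(X, y)
ols = LinearRegression().fit(X, y)

print(np.round(lasso.coef_, 2))  # irrelevant coefficients shrunk to exactly 0
print(np.round(ols.coef_, 2))    # all coefficients non-zero
```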
TensorFlow is the second-generation artificial intelligence learning system developed by Google, based on ( ).
- A . DistBelief
- B . PaleyFunction
- C . ConvexOne
- D . Infinity