Saturday , July 27 2024

UPSC Fever Free Certificate | Machine learning Deep Learning Quiz Answers

Hello everyone! Here’s a great opportunity for all students to grab a free online certification from UPSC Fever. The answers to the machine learning and deep learning quizzes are available in this post. Check them out, and all the best!

About the Quiz:

  • The quiz is conducted online, and participants get only one attempt.
  • A total of 20 MCQs are given, and an e-certificate will be issued only to those who secure a minimum of 50% marks.
  • Enter your details carefully, as they will appear on the e-certificate.
  • Registration is free.
  • An e-certificate will be provided to registered participants who score 50% or above in the quiz.

Here are the Deep Learning Quiz Answers:

  • True
  • False
  • c.shape = (2, 3)
  • c.shape = (3, 2)
  • ReLU
  • Leaky ReLU
  • sigmoid
  • True
  • False
  • size of the hidden layer
  • activation values
  • weight matrices
  • True
  • False
  • x = img.reshape((3, 32*32))
  • x = img.reshape((32*32*3, 1))
  • True
  • False
  • A neuron computes a function g that scales the input x linearly (Wx + b)
  • A neuron computes a linear function (z = Wx + b) followed by an activation function
  • A neuron computes the mean of all features before applying the output to an activation function
  • True
  • False
  • It doesn’t matter. So long as you initialize the weights randomly, gradient descent is not affected by whether the weights are large or small.
  • This will cause the inputs of the tanh to also be very large, thus causing gradients to be close to zero. The optimization algorithm will thus become slow.
  • This will cause the inputs of the tanh to also be very large, causing the units to be “highly activated” and thus speed up learning compared to if the weights had to start from small values.
  • This will cause the inputs of the tanh to also be very large, thus causing gradients to also become large. You therefore have to set α to be very small to prevent divergence; this will slow down learning.
  • The deeper layers of a neural network are typically computing more complex features of the input than the earlier layers.
  • The earlier layers of a neural network are typically computing more complex features of the input than the deeper layers.
  • c.shape = (4, 3)
  • The computation cannot happen because the sizes don’t match. It’s going to be “Error”!
  • True
  • False
  • We have access to a lot more data.
  • Neural Networks are a brand new field.
  • We have access to a lot more computational power.
  • Deep learning has resulted in significant improvements in important applications such as online advertising, speech recognition, and image recognition.
  • This will invoke broadcasting, so b is copied three times to become (3, 3), and * is an element-wise product, so c.shape will be (3, 3)
  • This will invoke broadcasting, so b is copied three times to become (3, 3), and * invokes a matrix multiplication of two 3×3 matrices, so c.shape will be (3, 3)
  • Each neuron in the first hidden layer will perform the same computation. So even after multiple iterations of gradient descent each neuron in the layer will be computing the same thing as other neurons.
  • Each neuron in the first hidden layer will perform the same computation in the first iteration. But after one iteration of gradient descent they will learn to compute different things because we have “broken symmetry”.
  • True
  • False
  • c.shape = (150,150)
  • c.shape = (12288, 45)
  • True
  • False
  • It can be trained as a supervised learning problem.
  • It is strictly more powerful than a Convolutional Neural Network (CNN).
  • AI runs on computers and is thus powered by electricity, but it is letting computers do things not possible before.
  • AI is powering personal devices in our homes and offices, similar to electricity.
  • Similar to electricity starting about 100 years ago, AI is transforming multiple industries.
  • Through the “smart grid”, AI is delivering a new wave of electricity.

Here are the Machine Learning Quiz Answers:

  • 500 randomly chosen images
  • 10,000 randomly chosen images
  • 10,000 images on which the algorithm made a mistake
  • 500 images on which the algorithm made a mistake
  • Reduce the number of features
  • Increase the number of features
  • Regularization
  • No Regularization
  • Treat both as classification problems.
  • Treat problem 1 as a classification problem, problem 2 as a regression problem.
  • Treat problem 1 as a regression problem, problem 2 as a classification problem.
  • Treat both as regression problems.
  • Trying smaller sets of features
  • Getting more training examples
  • Adding features
  • Adding polynomial features
  • Decreasing regularization parameter
  • Increasing regularization parameter
  • Trying smaller sets of features
  • Getting more training examples
  • Adding features
  • Adding polynomial features
  • Decreasing regularization parameter
  • Increasing regularization parameter
  • Train – 3,333,334; Dev – 3,333,333; Test – 3,333,333
  • Train – 9,500,000; Dev – 250,000; Test – 250,000
  • Underfitting
  • Overfitting
  • Correlating features
  • Covariance
  • Low bias
  • High bias
  • bias enhancing
  • bias reducing
  • variance enhancing
  • variance reducing
  • Classify emails as spam or not spam.
  • Watching you label emails as spam or not spam.
  • The number (or fraction) of emails correctly classified as spam/not spam.
  • None of the above, this is not a machine learning algorithm
  • Neural networks
  • Recurrent network
  • Convolutional Networks
  • Machine Learning
  • Dynamic Programming
  • landmark error
  • Goldilocks zone
  • Bayes error
  • Distribution error
  • Bernoulli’s error
  • Yes, because having 4.0% training error shows you have high bias.
  • Yes, because this shows your bias is higher than your variance.
  • No, because this shows your variance is higher than your bias.
  • No, because there is insufficient information to tell.
  • Feature scaling
  • Mean normalization
  • Correlating features
  • Covariance
  • Feature scaling
  • Mean normalization
  • Correlating features
  • Covariance
  • Classification
  • Regression
  • Clustering
  • Data mining
  • Exploratory data analysis
  • use case analysis
  • waterfall model
  • orthogonality
  • optimization
  • Gradient boosting
  • Gradient checking
  • Regularization
  • Gradient descent
  • Transfer learning
  • Fine tuning
  • Weight shifting
  • Gradient boosting
  • End-to-end deep learning
  • Data synthesis
  • Multi-task learning
  • Transfer learning
  • Fine tuning
  • End-to-end deep learning
  • Weight shifting
  • Gradient boosting
  • Data synthesis
  • Multi-task learning
  • bias enhancing
  • bias reducing
  • variance enhancing
  • variance reducing
  • Given email labeled as spam/not spam, learn a spam filter.
  • Given a set of news articles found on the web, group them into sets of articles about the same story
  • Given a database of customer data, automatically discover market segments and group customers into different market segments
  • Given a dataset of patients diagnosed as either having diabetes or not, learn to classify new patients as having diabetes or not
  • cause underfitting
  • cause overfitting
  • no effect
  • Fit training set well in cost function
  • Use of a bigger development set
  • Fit development set well in cost function
  • Regularization or using bigger training set
  • Fit test set well on cost function
  • True
  • False
  • Transfer learning
  • Fine tuning
  • End-to-end deep learning
  • Weight shifting
  • Gradient boosting
  • Data synthesis
  • Multi-task learning
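Feature scaling and mean normalization appear as answer options above; a minimal NumPy sketch of both (with made-up housing-style numbers, chosen only for illustration):

```python
import numpy as np

# Toy feature matrix: rows are examples, columns are features
# (e.g. house size in sq ft and number of bedrooms).
X = np.array([[2104.0, 3.0],
              [1600.0, 3.0],
              [2400.0, 4.0],
              [1416.0, 2.0]])

# Mean normalization: subtract each feature's mean so it is centered at 0.
X_centered = X - X.mean(axis=0)

# Feature scaling: divide by each feature's standard deviation, so every
# feature ends up with mean 0 and unit variance.
X_scaled = X_centered / X.std(axis=0)

print(X_scaled.mean(axis=0))  # approximately [0, 0]
print(X_scaled.std(axis=0))   # approximately [1, 1]
```

Scaling features to comparable ranges like this is what makes gradient descent converge evenly across features instead of zig-zagging along the largest one.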
