BERT stands for Bidirectional Encoder Representations from Transformers. Google is rolling out a new algorithmic update known as BERT, which focuses on understanding search queries in a more human-like way and is based on a neural network technique.
The update may take about a week to roll out fully worldwide, and it can change search engine result pages because it lets Google understand a wider variety of complex queries and the relationships between the words in them.
For example, the word “bank” would have the same context-free representation in “bank balance” and “bank of the river.” Contextual models instead generate a new representation of each word based on the other words that appear in the sentence. In the sentence “I check my bank balance,” a unidirectional contextual model would represent “bank” based on “I check my” but not “balance.” However, BERT represents “bank” using both its previous and next context — “I check my … balance” — starting from the very bottom of a deep neural network, making it deeply bidirectional.
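To make the distinction concrete, here is a minimal toy sketch (this is not BERT itself, and the word vectors and blending scheme are invented for illustration): a context-free model always returns the same vector for “bank,” while a contextual model that looks at both the left and right neighbors produces a different vector in each sentence.

```python
# Hypothetical context-free word vectors: each word maps to one fixed vector.
# These numbers are made up purely for demonstration.
WORD_VECTORS = {
    "i":       (0.1, 0.0),
    "check":   (0.2, 0.1),
    "my":      (0.0, 0.2),
    "bank":    (0.5, 0.5),
    "balance": (0.9, 0.1),
    "of":      (0.1, 0.3),
    "the":     (0.0, 0.1),
    "river":   (0.1, 0.9),
}

def context_free(word):
    """A context-free model: same vector for a word in every sentence."""
    return WORD_VECTORS[word]

def contextual(word, sentence):
    """A toy 'bidirectional' representation: blend the word's own vector
    with the average of every other word in the sentence, so words on
    BOTH sides of the target influence its representation."""
    others = [WORD_VECTORS[w] for w in sentence if w != word]
    avg = tuple(sum(dim) / len(others) for dim in zip(*others))
    base = WORD_VECTORS[word]
    return tuple(0.5 * b + 0.5 * a for b, a in zip(base, avg))

s1 = ["i", "check", "my", "bank", "balance"]
s2 = ["bank", "of", "the", "river"]

# Context-free: "bank" is identical in both sentences.
print(context_free("bank") == context_free("bank"))      # True
# Contextual: "bank" gets a different vector in each sentence.
print(contextual("bank", s1) == contextual("bank", s2))  # False
```

Real BERT does this blending through many stacked transformer layers rather than a single average, but the core idea is the same: the representation of “bank” changes depending on everything around it, in both directions.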
Today’s search landscape needs this kind of technique to make results more valuable and searching efficient for every single query. Featured snippets are more likely to be affected by this update because it is rolling out universally and applies to other languages as well.
From an SEO point of view, you don’t need to fix anything for this update: just as you can’t optimize for RankBrain, the same applies to BERT. The only way to get better visibility on SERPs is to make your content useful and relevant for the end user so that it satisfies the search intent.