
I am learning Azure Cognitive Search and got a bit confused about Analyzers and Normalizers.
https://learn.microsoft.com/en-us/azure/search/search-analyzers
https://learn.microsoft.com/en-us/azure/search/search-normalizers

As far as I understand, the only difference is that analyzers perform tokenization.

Could someone provide a good example of when I should use one over the other?

  • What are the benefits of an analyzer over a normalizer, and vice versa?
  • Which is more efficient performance-wise?

Thank you for your time!

2 Answers


  1. The simplest explanation is to use an analyzer for properties containing blocks of text. The normalizer is more suitable for properties with short content that you typically would use for filtering or sorting like City, Country, Name, etc.

    A block of text will have content in a specific language. A language-specific analyzer will do a better job of producing good tokens for internal use by the search engine. You will find that you get better recall for textual content that is correctly processed using a relevant analyzer.
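    To make this concrete, here is a minimal sketch of the split in a single index definition, assuming the REST API and a preview api-version that supports normalizers; the service name, index name, field names and key below are placeholders for illustration, not taken from the docs linked above.

    ```python
    import requests

    # Placeholder service endpoint, admin key and api-version; the "normalizer"
    # field property needs an api-version that supports normalizers (preview).
    SERVICE = "https://my-service.search.windows.net"
    API_VERSION = "2021-04-30-Preview"
    ADMIN_KEY = "<admin-key>"

    index_definition = {
        "name": "hotels-idx",
        "fields": [
            {"name": "hotelId", "type": "Edm.String", "key": True},

            # Block of text: a language analyzer tokenizes, lowercases and stems
            # the content, which is what gives good recall in full-text search.
            {"name": "description", "type": "Edm.String",
             "searchable": True, "analyzer": "en.microsoft"},

            # Short value used for filtering/sorting/faceting: a normalizer keeps
            # it as a single term but lowercases it, so a filter on "new york"
            # still matches documents stored as "New York".
            {"name": "city", "type": "Edm.String",
             "filterable": True, "sortable": True, "facetable": True,
             "normalizer": "lowercase"},
        ],
    }

    resp = requests.put(
        f"{SERVICE}/indexes/{index_definition['name']}",
        params={"api-version": API_VERSION},
        headers={"Content-Type": "application/json", "api-key": ADMIN_KEY},
        json=index_definition,
    )
    resp.raise_for_status()
    ```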

  2. The values that you pass in a filter, sort, or facet can’t be analyzed, so "normalizers" were created to fill that gap. They don’t do everything that an analyzer can do (i.e., there is a smaller set of allowed tokenizers, token filters, and character filters) but they take care of the bigger issues, like normalizing text casing and getting rid of punctuation.
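    If the predefined normalizers aren't enough, the same (preview) index payload can also declare a custom one built from that restricted set of filters. A rough sketch, with made-up names (clean_lowercase, customers-idx), uploaded the same way as the sketch in the first answer:

    ```python
    # Custom normalizer: no tokenizer is involved, only the allowed char filters
    # and token filters, applied to the whole field value as one term.
    custom_normalizer_index = {
        "name": "customers-idx",
        "fields": [
            {"name": "id", "type": "Edm.String", "key": True},
            {"name": "country", "type": "Edm.String",
             "filterable": True, "facetable": True,
             "normalizer": "clean_lowercase"},
        ],
        "normalizers": [
            {
                "name": "clean_lowercase",
                "@odata.type": "#Microsoft.Azure.Search.CustomNormalizer",
                "charFilters": [],
                # Lowercase + accent folding, so "São Paulo" and "sao paulo"
                # end up in the same facet bucket.
                "tokenFilters": ["lowercase", "asciifolding"],
            }
        ],
    }
    ```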
