LeetCode Most Common Word
Published: 2023-01-31


Here is an optimized version of the thought process and solution:

  • Data Preparation

    • Convert the entire paragraph to lowercase to handle case insensitivity.
    • Remove all punctuation marks (such as commas, periods, exclamation points, etc.) to isolate words.
    • Ensure words are properly separated by spaces to avoid partial words (e.g., "ball," becomes "ball").
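The preparation step can be sketched as a single pass over the characters (the class and method names below are illustrative, not part of the problem's required API):

```java
public class CleanDemo {
    // Lowercase the text and map every non-letter character to a space,
    // so punctuation can never stick to a word (e.g. "ball," -> "ball").
    public static String clean(String paragraph) {
        StringBuilder sb = new StringBuilder();
        for (char c : paragraph.toLowerCase().toCharArray()) {
            sb.append(c >= 'a' && c <= 'z' ? c : ' ');
        }
        return sb.toString();
    }
}
```

Replacing punctuation with a space (rather than simply deleting it) keeps adjacent words from being fused together.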
  • Word Frequency Calculation

    • Traverse the prepared string, extracting each word by ignoring punctuation and case differences.
    • Use a hash map (dictionary) to count occurrences of each word.
    • Concretely: lowercase the text, replace every non-letter character with a space, then split the cleaned string on runs of spaces; each resulting token is a complete word whose count can be updated in the hash map.
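The counting step from these bullets can be sketched as follows (`CountDemo` is an illustrative name; the input is assumed to already be lowercase with words separated by spaces):

```java
import java.util.HashMap;
import java.util.Map;

public class CountDemo {
    // Tally each word of an already-cleaned, space-separated string.
    public static Map<String, Integer> count(String cleaned) {
        Map<String, Integer> freq = new HashMap<>();
        for (String w : cleaned.trim().split(" +")) {
            freq.put(w, freq.getOrDefault(w, 0) + 1);
        }
        return freq;
    }
}
```

`getOrDefault` avoids a separate `containsKey` check when a word is seen for the first time.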
  • Filter Banned Words

    • Store banned words in a set for quick lookup.
    • Iterate through the hash map to exclude any words that exist in the banned set, keeping only valid words.
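The lookup structure can be sketched like this (names are illustrative); a `HashSet` gives O(1) average-time `contains`, versus O(n) for rescanning the banned array on every word:

```java
import java.util.HashSet;
import java.util.Set;

public class BannedDemo {
    // Lowercase each banned word and store it in a HashSet for O(1)
    // average-time membership checks.
    public static Set<String> toSet(String[] banned) {
        Set<String> set = new HashSet<>();
        for (String b : banned) {
            set.add(b.toLowerCase());
        }
        return set;
    }
}
```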
  • Determine Most Frequent Word

    • Scan the remaining words once, tracking the word with the highest count seen so far.
    • Return that word; the problem guarantees the most frequent non-banned word is unique, so no sorting or tie-breaking is needed.
  • Final Solution Code

    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;

    public class Solution {
        public String mostCommonWord(String paragraph, String[] banned) {
            // Lowercase the paragraph and replace every non-letter with a space
            StringBuilder cleanParagraph = new StringBuilder();
            for (char c : paragraph.toLowerCase().toCharArray()) {
                if (c >= 'a' && c <= 'z') {
                    cleanParagraph.append(c);
                } else {
                    cleanParagraph.append(' ');
                }
            }
            // Split into words on runs of spaces
            String[] words = cleanParagraph.toString().trim().split(" +");
            // Count the frequency of each word
            Map<String, Integer> frequencyMap = new HashMap<>();
            for (String word : words) {
                frequencyMap.put(word, frequencyMap.getOrDefault(word, 0) + 1);
            }
            // Store banned words in a set for quick lookup
            HashSet<String> bannedWords = new HashSet<>();
            for (String bw : banned) {
                bannedWords.add(bw.toLowerCase());
            }
            // Linear scan: keep the most frequent non-banned word
            int maxCount = -1;
            String result = "";
            for (Map.Entry<String, Integer> entry : frequencyMap.entrySet()) {
                if (!bannedWords.contains(entry.getKey()) && entry.getValue() > maxCount) {
                    maxCount = entry.getValue();
                    result = entry.getKey();
                }
            }
            return result;
        }
    }

    Explanation

    • The code first processes the input paragraph to remove punctuation and convert it to lowercase, ensuring uniformity in word processing.
    • It then splits the cleaned string into individual words and uses a hash map to count each word's occurrences.
    • Banned words are stored in a set for quick exclusion.
    • Finally, the code iterates through the frequency map, excluding banned words, and identifies the word with the highest count, which is then returned as the result.
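As an end-to-end sanity check, here is a compact, self-contained variant of the same algorithm (a regex split standing in for the manual character scan; `Demo` is an illustrative name), suitable for running against the classic example from the problem statement:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class Demo {
    public static String mostCommonWord(String paragraph, String[] banned) {
        // Banned words go into a set for O(1) average lookup
        Set<String> ban = new HashSet<>();
        for (String b : banned) ban.add(b.toLowerCase());
        // Split on runs of non-letters, skipping banned and empty tokens
        Map<String, Integer> freq = new HashMap<>();
        for (String w : paragraph.toLowerCase().split("[^a-z]+")) {
            if (!w.isEmpty() && !ban.contains(w)) {
                freq.merge(w, 1, Integer::sum);
            }
        }
        // Linear scan for the maximum count
        String best = "";
        int max = 0;
        for (Map.Entry<String, Integer> e : freq.entrySet()) {
            if (e.getValue() > max) {
                max = e.getValue();
                best = e.getKey();
            }
        }
        return best;
    }
}
```

For `paragraph = "Bob hit a ball, the ball flew far after it was hit."` and `banned = ["hit"]`, this returns `"ball"`.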

