Sitemap

A list of all the posts and pages found on the site. For you robots out there, there is an XML version available for digesting as well.

Pages

Posts

【PIN】Bugs

less than 1 minute read

Published:

I found that the online images in the blog posts do not display properly on the website. I have uploaded PDF versions of the posts to the ./Posts_PDF folder so that the text and images can be read more easily.

Machine Learning Review

less than 1 minute read

Published:

Notes from Hung-yi Lee's (NTU) machine learning course, to learn some of the fundamentals underlying ML. The related notes are in the PDFs in the project repository (./Posts_PDF); no separate markdown blog posts were made for them.

DS-String

2 minute read

Published:

Today's topic is strings, from Data Structures and Algorithms.

CO-Representation and Computation of Data

less than 1 minute read

Published:

Today is Chapter 2: the representation and computation of data. This chapter explores how data is represented in a computer and how the arithmetic/logic unit carries out arithmetic and logical operations on it.
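Not from the post itself, but a minimal illustration of the chapter's first topic, how a signed integer is represented in a computer (two's complement), using Python only as a calculator:

```python
# Two's-complement representation of -5 in 8 bits: the form in which the
# hardware actually stores a signed integer.
value = -5
bits = value & 0xFF            # keep the low 8 bits -> 0b11111011 (decimal 251)
print(format(bits, '08b'))     # prints: 11111011
```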

DS-Stack and Queue

4 minute read

Published:

Continuing the review of the 408 fundamentals, today's topic is stacks and queues from Data Structures and Algorithms. Stacks and queues are both linear lists with restricted operations.
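A minimal Python sketch (not from the post) of what "linear list with restricted operations" means here: a stack permits insertion and deletion only at one end, the top (LIFO), while a queue inserts at the rear and deletes at the front (FIFO).

```python
from collections import deque

# Stack: insertion and deletion are restricted to one end (the top) -- LIFO.
stack = []
stack.append(1)            # push 1
stack.append(2)            # push 2
top = stack.pop()          # pop -> 2 (last in, first out)

# Queue: insertion is restricted to the rear, deletion to the front -- FIFO.
queue = deque()
queue.append(1)            # enqueue 1 at the rear
queue.append(2)            # enqueue 2 at the rear
front = queue.popleft()    # dequeue -> 1 (first in, first out)

print(top, front)          # 2 1
```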

CO-Introduction

less than 1 minute read

Published:

Studying Computer Organization in parallel; today is Chapter 1, the introduction.

DS-Linear List

1 minute read

Published:

Continuing the review of the 408 fundamentals, today's topic is linear lists from Data Structures and Algorithms.

DS-Introduction

less than 1 minute read

Published:

Starting today, I am reviewing the 408 fundamentals and extending them where relevant. Today is the introduction to Data Structures and Algorithms: some basic concepts.

VMamba

4 minute read

Published:

Today's topic is VMamba, the core module of the Mamba-based vision work I have been doing for so long.

Mamba

2 minute read

Published:

After all this groundwork, I can finally summarize the core network behind my recent work: Mamba. Although some researchers dismiss Mamba as "MambaOUT", in my view, judging by its role in medical image processing, Mamba achieves results comparable to Transformers on high-resolution medical images while keeping linear time complexity. Especially when distinguishing complex medical structures, Mamba extracts global features from context and usually shows better structural understanding than CNNs.
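A brief aside (not from the post) on why the linear-complexity claim holds for SSM-style models: each step of the standard discretized state-space recurrence only updates a fixed-size state, so a length-L sequence costs time proportional to L, whereas self-attention builds an L-by-L score matrix.

```latex
h_t = \bar{A}\,h_{t-1} + \bar{B}\,x_t, \qquad y_t = C\,h_t
% One step costs O(Nd) for state size N and channel width d, so a length-L sequence is
% O(LNd) -- linear in L -- while self-attention forms an L x L score matrix, O(L^2 d).
```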

The State Space Model (SSM) - 2

less than 1 minute read

Published:

Life is finally back on track and the new semester is about to start, so I am continuing the SSM notes from last time.

The State Space Model (SSM) - 1

less than 1 minute read

Published:

This lays the groundwork for the introduction to Mamba. Over the Chinese New Year holiday I could not settle down to study concepts and could only grind problems, so today I am filling a gap: the State Space Model (SSM).

hot100 (80 / 100)

54 minute read

Published:

Started grinding the LeetCode hot100 on 2025-01-24, planning to finish before 2025-03-15. Last updated on 2025-03-11. Pausing here for now (86 / 100); I will move on to other topics and work through the remaining hard problems gradually rather than grinding on them all day.

ViT

5 minute read

Published:

While reviewing the Transformer, I am also extending to ViT. The core conclusion of the original ViT paper is that, given enough data for pretraining, ViT outperforms CNNs, overcoming the Transformer's lack of inductive bias and transferring well to downstream tasks.

Transformer-2

less than 1 minute read

Published:

Reviewing the underlying principles of common neural network architectures: the Transformer. I did not finish the summary on the 21st, so I am wrapping it up today, the 22nd.

Transformer-1

less than 1 minute read

Published:

Reviewing the underlying principles of common neural network architectures: the Transformer. My previous research projects did not involve any Transformer innovations, so I never studied it closely; today I am filling that gap.

RNN

less than 1 minute read

Published:

Reviewing the underlying principles of common neural network architectures: the RNN (recurrent neural network). I have only heard of it and have no concrete project experience with it, so this is a from-scratch summary.

CNN

less than 1 minute read

Published:

Reviewing the underlying principles of common neural network architectures: the CNN (convolutional neural network).

FNN

less than 1 minute read

Published:

Reviewing the underlying principles of common neural network architectures: the FNN (feedforward neural network).

portfolio

publications

CFM-UNet: coupling local and global feature extraction networks for medical image segmentation

Published in Scientific Reports, 2025

The proposed CFM-UNet integrates CNN-based Bottle2neck blocks for local feature extraction and Mamba-based visual state space blocks for global feature extraction. These parallel branches perform feature fusion through our designed SEF block, achieving complementary advantages.
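As a rough illustration only, here is a schematic PyTorch-style sketch of the parallel two-branch pattern the abstract describes; the LocalBranch, GlobalBranch, and FusionBlock modules below are placeholder stand-ins for the paper's Bottle2neck, visual state space, and SEF blocks, not the published implementation.

```python
import torch
import torch.nn as nn

# Schematic only: the three modules below are simple placeholders standing in for the
# paper's Bottle2neck (CNN branch), visual state space (Mamba branch), and SEF fusion
# blocks -- they are NOT the published CFM-UNet implementation.
class LocalBranch(nn.Module):
    def __init__(self, c):
        super().__init__()
        self.conv = nn.Conv2d(c, c, kernel_size=3, padding=1)  # local receptive field
    def forward(self, x):
        return torch.relu(self.conv(x))

class GlobalBranch(nn.Module):
    def __init__(self, c):
        super().__init__()
        self.proj = nn.Conv2d(c, c, kernel_size=1)  # placeholder for a global-context block
    def forward(self, x):
        return torch.relu(self.proj(x))

class FusionBlock(nn.Module):
    def __init__(self, c):
        super().__init__()
        self.mix = nn.Conv2d(2 * c, c, kernel_size=1)  # placeholder for the fusion step
    def forward(self, a, b):
        return self.mix(torch.cat([a, b], dim=1))

class ParallelBlock(nn.Module):
    """Run the local and global branches in parallel, then fuse their feature maps."""
    def __init__(self, c):
        super().__init__()
        self.local_branch = LocalBranch(c)
        self.global_branch = GlobalBranch(c)
        self.fuse = FusionBlock(c)
    def forward(self, x):
        return self.fuse(self.local_branch(x), self.global_branch(x))

x = torch.randn(1, 32, 64, 64)
print(ParallelBlock(32)(x).shape)  # torch.Size([1, 32, 64, 64])
```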

Recommended citation: Niu, K., Han, J. & Cai, J. CFM-UNet: coupling local and global feature extraction networks for medical image segmentation. Sci Rep 15, 22236 (2025). https://doi.org/10.1038/s41598-025-92010-y
Download Paper

talks

teaching