DCTLSA

Densely Connected Transformer with Linear Self-Attention for Lightweight Image Super-Resolution

This is the official PyTorch implementation of "Densely Connected Transformer with Linear Self-Attention for Lightweight Image Super-Resolution" (DCTLSA), published in IEEE Transactions on Instrumentation and Measurement, 2023.

If this repository is helpful to your work, please cite our paper:

@ARTICLE{zeng2023densely,
  author={Zeng, Kun and Lin, Hanjiang and Yan, Zhiqiang and Fang, Jinsheng},
  journal={IEEE Transactions on Instrumentation and Measurement}, 
  title={Densely Connected Transformer With Linear Self-Attention for Lightweight Image Super-Resolution}, 
  year={2023},
  volume={72},
  pages={1-12},
  doi={10.1109/TIM.2023.3304672}
}

How To Test

sh demo.sh

The testing results will be saved in the ./experiments folder.
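The script itself is the authoritative reference for what gets run. As a rough, hedged sketch: since this code is built on EDSR-PyTorch, demo.sh typically wraps a single EDSR-PyTorch-style test command along the lines shown below. The flag names come from the EDSR-PyTorch codebase; the model name, scale, test set, and checkpoint path are illustrative placeholders, not values taken from this repository.

# Hypothetical example of an EDSR-PyTorch-style test invocation.
# Flag names follow EDSR-PyTorch; the model name, scale, test set, and
# checkpoint path are placeholders -- consult demo.sh for the real values.
python main.py \
  --model DCTLSA \
  --scale 4 \
  --data_test Set5 \
  --pre_train ../pretrained/dctlsa_x4.pt \
  --test_only \
  --save_results

With --test_only set, training is skipped and only evaluation runs; with --save_results set, the super-resolved images are written under the experiments folder mentioned above.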

License

This project is released under the Apache 2.0 license.

Acknowledgements

This code is built on EDSR-PyTorch. We thank the authors for sharing their code.
