//
// Regular Expression for URL validation
//
// Author: Diego Perini
// Created: 2010/12/05
// Updated: 2018/09/12
// License: MIT
//
// Copyright (c) 2010-2018 Diego Perini (http://www.iport.it)
//
// Permission is hereby granted, free of charge, to any person
// obtaining a copy of this software and associated documentation
// files (the "Software"), to deal in the Software without
// restriction, including without limitation the rights to use,
// copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the
// Software is furnished to do so, subject to the following
// conditions:
//
// The above copyright notice and this permission notice shall be
// included in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
// EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
// OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
// NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
// HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
// WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
// FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
// OTHER DEALINGS IN THE SOFTWARE.
//
// the regular expression, composed & commented,
// could be easily tweaked for RFC compliance;
// it was expressly modified to fit & satisfy
// these tests for a URL shortener:
//
// http://mathiasbynens.be/demo/url-regex
//
// Notes on possible differences from a standard/generic validation:
//
// - the UTF-8 character class takes into consideration the full Unicode range
// - TLDs have been made mandatory, so single names like "localhost" fail
// - protocols have been restricted to ftp, http and https only, as requested
//
// Changes:
//
// - IP address dotted notation validation, range: 1.0.0.0 - 223.255.255.255
// first and last IP address of each class is considered invalid
// (since they are broadcast/network addresses)
//
// - Added exclusion of private, reserved and/or local networks ranges
// - Made starting path slash optional (http://example.com?foo=bar)
// - Allow a dot (.) at the end of hostnames (http://example.com.)
// - Allow an underscore (_) character in host/domain names
// - Check dot delimited parts length and total length
// - Made protocol optional, allowed short syntax //
//
// Compressed one-line versions:
//
// Javascript regex version
//
// /^(?:(?:(?:https?|ftp):)?\/\/)(?:\S+(?::\S*)?@)?(?:(?!(?:10|127)(?:\.\d{1,3}){3})(?!(?:169\.254|192\.168)(?:\.\d{1,3}){2})(?!172\.(?:1[6-9]|2\d|3[0-1])(?:\.\d{1,3}){2})(?:[1-9]\d?|1\d\d|2[01]\d|22[0-3])(?:\.(?:1?\d{1,2}|2[0-4]\d|25[0-5])){2}(?:\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4]))|(?:(?:[a-z0-9\u00a1-\uffff][a-z0-9\u00a1-\uffff_-]{0,62})?[a-z0-9\u00a1-\uffff]\.)+(?:[a-z\u00a1-\uffff]{2,}\.?))(?::\d{2,5})?(?:[/?#]\S*)?$/i
//
// PHP version (uses % symbol as delimiter)
//
// %^(?:(?:(?:https?|ftp):)?\/\/)(?:\S+(?::\S*)?@)?(?:(?!(?:10|127)(?:\.\d{1,3}){3})(?!(?:169\.254|192\.168)(?:\.\d{1,3}){2})(?!172\.(?:1[6-9]|2\d|3[0-1])(?:\.\d{1,3}){2})(?:[1-9]\d?|1\d\d|2[01]\d|22[0-3])(?:\.(?:1?\d{1,2}|2[0-4]\d|25[0-5])){2}(?:\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4]))|(?:(?:[a-z0-9\x{00a1}-\x{ffff}][a-z0-9\x{00a1}-\x{ffff}_-]{0,62})?[a-z0-9\x{00a1}-\x{ffff}]\.)+(?:[a-z\x{00a1}-\x{ffff}]{2,}\.?))(?::\d{2,5})?(?:[/?#]\S*)?$%iuS
//
var re_weburl = new RegExp(
"^" +
// protocol identifier (optional)
// short syntax // still required
"(?:(?:(?:https?|ftp):)?\\/\\/)" +
// user:pass BasicAuth (optional)
"(?:\\S+(?::\\S*)?@)?" +
"(?:" +
// IP address exclusion
// private & local networks
"(?!(?:10|127)(?:\\.\\d{1,3}){3})" +
"(?!(?:169\\.254|192\\.168)(?:\\.\\d{1,3}){2})" +
"(?!172\\.(?:1[6-9]|2\\d|3[0-1])(?:\\.\\d{1,3}){2})" +
// IP address dotted notation octets
// excludes loopback network 0.0.0.0
// excludes reserved space >= 224.0.0.0
// excludes network & broadcast addresses
// (first & last IP address of each class)
"(?:[1-9]\\d?|1\\d\\d|2[01]\\d|22[0-3])" +
"(?:\\.(?:1?\\d{1,2}|2[0-4]\\d|25[0-5])){2}" +
"(?:\\.(?:[1-9]\\d?|1\\d\\d|2[0-4]\\d|25[0-4]))" +
"|" +
// host & domain names, may end with dot
// can be replaced by a shortest alternative
// (?![-_])(?:[-\\w\\u00a1-\\uffff]{0,63}[^-_]\\.)+
"(?:" +
"(?:" +
"[a-z0-9\\u00a1-\\uffff]" +
"[a-z0-9\\u00a1-\\uffff_-]{0,62}" +
")?" +
"[a-z0-9\\u00a1-\\uffff]\\." +
")+" +
// TLD identifier name, may end with dot
"(?:[a-z\\u00a1-\\uffff]{2,}\\.?)" +
")" +
// port number (optional)
"(?::\\d{2,5})?" +
// resource path (optional)
"(?:[/?#]\\S*)?" +
"$", "i"
);
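For quick sanity-checking, here is a minimal usage sketch in plain JavaScript. It re-declares re_weburl from the compressed one-liner above so the snippet is self-contained; the expected results follow directly from the "Changes" notes.

```javascript
// Compressed one-line version of re_weburl, identical to the composed version above
const re_weburl = /^(?:(?:(?:https?|ftp):)?\/\/)(?:\S+(?::\S*)?@)?(?:(?!(?:10|127)(?:\.\d{1,3}){3})(?!(?:169\.254|192\.168)(?:\.\d{1,3}){2})(?!172\.(?:1[6-9]|2\d|3[0-1])(?:\.\d{1,3}){2})(?:[1-9]\d?|1\d\d|2[01]\d|22[0-3])(?:\.(?:1?\d{1,2}|2[0-4]\d|25[0-5])){2}(?:\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4]))|(?:(?:[a-z0-9\u00a1-\uffff][a-z0-9\u00a1-\uffff_-]{0,62})?[a-z0-9\u00a1-\uffff]\.)+(?:[a-z\u00a1-\uffff]{2,}\.?))(?::\d{2,5})?(?:[/?#]\S*)?$/i;

console.log(re_weburl.test('http://example.com'));         // true
console.log(re_weburl.test('//example.com'));              // true  (protocol-relative short syntax)
console.log(re_weburl.test('http://example.com.'));        // true  (trailing dot on hostname)
console.log(re_weburl.test('http://example.com?foo=bar')); // true  (starting path slash optional)
console.log(re_weburl.test('http://localhost'));           // false (TLD is mandatory)
console.log(re_weburl.test('example.com'));                // false (// is still required)
```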
@breinkober commented Jun 10, 2020

Oh okay, but how can I check whether the TLD is a valid one?

e.g. check it against this list: https://publicsuffix.org/list/effective_tld_names.dat
I work with JS.
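One way to sketch that in JS: parse the suffix list into a Set and test the tail of the hostname against it. The tiny suffixes Set below is a hardcoded stand-in for the real parsed effective_tld_names.dat; note that public suffixes can be multi-label (e.g. co.uk), so the helper tests progressively shorter tails of the hostname.

```javascript
// Stand-in for the parsed Public Suffix List. In practice, download
// effective_tld_names.dat, drop comment ('//') and empty lines, and
// collect the remaining entries into this Set.
const suffixes = new Set(['com', 'org', 'uk', 'co.uk']);

// True if the hostname ends in a known public suffix. Multi-label
// suffixes such as "co.uk" are found because we check every tail
// of the hostname, longest first.
function hasKnownSuffix(hostname) {
  const labels = hostname.toLowerCase().replace(/\.$/, '').split('.');
  for (let i = 0; i < labels.length; i++) {
    if (suffixes.has(labels.slice(i).join('.'))) return true;
  }
  return false;
}

console.log(hasKnownSuffix('example.co.uk'));   // true
console.log(hasKnownSuffix('example.com.'));    // true  (trailing dot tolerated)
console.log(hasKnownSuffix('example.invalid')); // false
```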

@ddelange commented Jun 10, 2020

@ryankearney commented Jul 28, 2020

I have added simple network ranges validation, the rules I used are:

  • valid range 1.0.0.0 - 223.255.255.255; network addresses at and above 224.0.0.0 are reserved addresses
  • first and last IP address of each class is excluded, since they are used as network/broadcast addresses
    Since I don't think this is worth implementing completely in a regular expression, a following pass should exclude the Intranet address space:
    10.0.0.0 - 10.255.255.255
    172.16.0.0 - 172.31.255.255
    192.168.0.0 - 192.168.255.255
    the loopback and the automatic configuration address space:
    127.0.0.0 - 127.255.255.255
    169.254.0.0 - 169.254.255.255
    while the local, multicast and reserved address spaces:
    0.0.0.0 - 0.255.255.255 (SPECIAL-IPV4-LOCAL-ID-IANA-RESERVED)
    224.0.0.0 - 239.255.255.255 (MCAST-NET)
    240.0.0.0 - 255.255.255.255 (SPECIAL-IPV4-FUTURE-USE-IANA-RESERVED)
    should already be excluded by the above regular expression.

This is a very minimal list of tests to add to your test suite:

PASS
"http://10.1.1.1",
"http://10.1.1.254",
"http://223.255.255.254"

FAIL
"http://0.0.0.0",
"http://10.1.1.0",
"http://10.1.1.255",
"http://224.1.1.1",
"http://1.1.1.1.1"

Need testing :)

There's absolutely nothing wrong with http://10.1.1.0 or http://10.1.1.255

Not every network is a /24
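Note that the changelog quoted above predates the private/reserved-range exclusions that are now part of the gist, so in the current revision the 10.x entries of the PASS list fail as well. A quick check against the compressed one-liner:

```javascript
const re_weburl = /^(?:(?:(?:https?|ftp):)?\/\/)(?:\S+(?::\S*)?@)?(?:(?!(?:10|127)(?:\.\d{1,3}){3})(?!(?:169\.254|192\.168)(?:\.\d{1,3}){2})(?!172\.(?:1[6-9]|2\d|3[0-1])(?:\.\d{1,3}){2})(?:[1-9]\d?|1\d\d|2[01]\d|22[0-3])(?:\.(?:1?\d{1,2}|2[0-4]\d|25[0-5])){2}(?:\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4]))|(?:(?:[a-z0-9\u00a1-\uffff][a-z0-9\u00a1-\uffff_-]{0,62})?[a-z0-9\u00a1-\uffff]\.)+(?:[a-z\u00a1-\uffff]{2,}\.?))(?::\d{2,5})?(?:[/?#]\S*)?$/i;

console.log(re_weburl.test('http://223.255.255.254')); // true
console.log(re_weburl.test('http://10.1.1.1'));        // false (10/8 is now excluded outright)
console.log(re_weburl.test('http://224.1.1.1'));       // false (reserved space >= 224.0.0.0)
console.log(re_weburl.test('http://1.1.1.1.1'));       // false (five octets)
```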

@washingtonintegritas commented Nov 13, 2020

(?:(?:https?://|[a-z0-9.]?)+(?:(?:[.]))+(?:[a-z]{2,3})+(?=\s|$))

@pollenflugkalender commented Jan 23, 2021

I have added simple network ranges validation, the rules I used are:

  • valid range 1.0.0.0 - 223.255.255.255; network addresses at and above 224.0.0.0 are reserved addresses
  • first and last IP address of each class is excluded, since they are used as network/broadcast addresses
    Since I don't think this is worth implementing completely in a regular expression, a following pass should exclude the Intranet address space:
    10.0.0.0 - 10.255.255.255
    172.16.0.0 - 172.31.255.255
    192.168.0.0 - 192.168.255.255
    the loopback and the automatic configuration address space:
    127.0.0.0 - 127.255.255.255
    169.254.0.0 - 169.254.255.255
    while the local, multicast and reserved address spaces:
    0.0.0.0 - 0.255.255.255 (SPECIAL-IPV4-LOCAL-ID-IANA-RESERVED)
    224.0.0.0 - 239.255.255.255 (MCAST-NET)
    240.0.0.0 - 255.255.255.255 (SPECIAL-IPV4-FUTURE-USE-IANA-RESERVED)
    should already be excluded by the above regular expression.

This is a very minimal list of tests to add to your test suite:
PASS
"http://10.1.1.1",
"http://10.1.1.254",
"http://223.255.255.254"
FAIL
"http://0.0.0.0",
"http://10.1.1.0",
"http://10.1.1.255",
"http://224.1.1.1",
"http://1.1.1.1.1"
Need testing :)

There's absolutely nothing wrong with http://10.1.1.0 or http://10.1.1.255

Not every network is a /24

Yes, you are absolutely right. Is this defect still not fixed? I reported that bug a couple of years ago:

// old (wrong) RegExp excludes valid IPs ending with .0 or .255
// "(?:\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4]))" +

// new: RegExp includes IPs ending with .0 or .255
"(?:\.(?:[0-9]\d?|1\d\d|2[0-4]\d|25[0-5]))" +
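The difference is easiest to see if the two last-octet fragments are isolated into standalone anchored patterns (these two mini-regexes are illustration only, not part of the gist):

```javascript
// Last-octet subpattern as in the gist (rejects .0 and .255):
const lastOctetOld = /^\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4])$/;
// Proposed replacement (accepts .0 and .255):
const lastOctetNew = /^\.(?:[0-9]\d?|1\d\d|2[0-4]\d|25[0-5])$/;

console.log(lastOctetOld.test('.254')); // true  (both agree here)
console.log(lastOctetOld.test('.255')); // false
console.log(lastOctetNew.test('.255')); // true
console.log(lastOctetOld.test('.0'));   // false
console.log(lastOctetNew.test('.0'));   // true
```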

@7c commented Jan 26, 2021

https://rdap.nic.xn--1ck2e1b/domain/test.xn--1ck2e1b should be valid url but not validated

@ddelange commented Jan 27, 2021

@7c that is a punycode URL. Try converting it to Unicode with an IDNA library before validating; the regex accepts almost all valid Unicode URLs.

@dperini (Author) commented Jan 27, 2021

@ddelange,
thank you for explaining that so concisely, and for the helpful IDNA link for user @7c.
I believe these capabilities (accepting Unicode URLs) have gone unnoticed by most users dealing with "punycode" URLs.
Also thanks for your earlier answer explaining that TLD validation is outside this regexp's scope.

@avalanche1 commented Feb 7, 2021

@dperini, it incorrectly returns true for "https://goo.gl.".
Edit: Hmm... it seems that Chrome does accept that as a valid URL. How so?

@dperini (Author) commented Feb 7, 2021

@avalanche1,
a trailing dot in a domain name is perfectly valid and accepted by all browsers, not just Chrome.
Also check out the other linked posts above by ddelange (there are more quirks about it).

@D0LLYNH0 commented Apr 30, 2021

@FANGOD,
only one does not match. https://regex101.com/r/dZBcOS/1

@ddelange commented Apr 30, 2021

Previously discussed punycode TLDs aside, the hypothetical www.4j above raises the question of whether numbers in TLDs should be allowed by the regex (although there are currently none in the Public Suffix List).
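Indeed, the TLD part of the regex is (?:[a-z\u00a1-\uffff]{2,}\.?), which admits no ASCII digits, so a digit anywhere in the final label fails today, while digits remain fine in the other labels:

```javascript
const re_weburl = /^(?:(?:(?:https?|ftp):)?\/\/)(?:\S+(?::\S*)?@)?(?:(?!(?:10|127)(?:\.\d{1,3}){3})(?!(?:169\.254|192\.168)(?:\.\d{1,3}){2})(?!172\.(?:1[6-9]|2\d|3[0-1])(?:\.\d{1,3}){2})(?:[1-9]\d?|1\d\d|2[01]\d|22[0-3])(?:\.(?:1?\d{1,2}|2[0-4]\d|25[0-5])){2}(?:\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4]))|(?:(?:[a-z0-9\u00a1-\uffff][a-z0-9\u00a1-\uffff_-]{0,62})?[a-z0-9\u00a1-\uffff]\.)+(?:[a-z\u00a1-\uffff]{2,}\.?))(?::\d{2,5})?(?:[/?#]\S*)?$/i;

console.log(re_weburl.test('http://www.4j'));       // false (digit in the TLD)
console.log(re_weburl.test('http://www.foo4.com')); // true  (digits allowed elsewhere)
```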

@TimNZ commented Sep 10, 2021

Accepts a trailing period

http://x.comddfsdfsdf.

@jacobmischka commented Sep 26, 2021

Accepts a trailing period

http://x.comddfsdfsdf.

Up four comments: https://gist.github.com/dperini/729294#gistcomment-3623271

@intellent commented Oct 13, 2021

Regarding the trailing period: for plain TLDs that's totally fine. However, what about paths? Let's say you want to use this regex to link URLs in a given text like this one:

For further info about matching URLs, visit https://gist.github.com/dperini/729294.

This will produce a link to https://gist.github.com/dperini/729294. (including the dot at the end), which won't work.

Maybe one has to distinguish between a valid URL and a working URL.
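A common heuristic when linkifying free text (a sketch of one possible pre-processing step, not part of the gist; trimCandidate is a hypothetical helper) is to strip trailing sentence punctuation from each extracted candidate before validating it:

```javascript
// Strip punctuation that is more likely to end the sentence than the URL
function trimCandidate(candidate) {
  return candidate.replace(/[.,;:!?)\]}'"]+$/, '');
}

console.log(trimCandidate('https://gist.github.com/dperini/729294.'));
// 'https://gist.github.com/dperini/729294'
console.log(trimCandidate('https://example.com/path'));
// 'https://example.com/path' (unchanged)
```

This deliberately trades away URLs that genuinely end in a dot, which is exactly the valid-URL vs. working-URL distinction above.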

@Synchro commented Oct 13, 2021

The trailing dot is indeed perfectly valid as part of the path. How you extract URLs from the text before validating them is an entirely separate problem that this lib should not attempt to address.

@intellent commented Oct 14, 2021

The trailing dot is indeed perfectly valid as part of the path. How you extract URLs from the text before validating them is an entirely separate problem that this lib should not attempt to address.

Fair enough. You don’t happen to know of any lib to reliably extract URLs from texts?

@visusys commented Nov 17, 2021

I've ported this to PowerShell and I'm loving it! What an amazing validation script. If at all possible, is there an easy modification that would make the protocol optional, so URLs like the ones below are allowed:

www.google.com                   
github.com/PowerShell/PowerShell 
4sysops.com/archives/            
www.bing.com   

I've tried commenting out the initial protocol identifier "(?:(?:(?:https?|ftp):)?\/\/)", but then URLs with protocols don't get matched/validated.

Any help would be awesome.

@Synchro commented Nov 17, 2021

@visusys I'd recommend approaching it from the other direction – instead of adjusting the validator to allow invalid things, adjust your data. Check your own URLs and if they don't start with https://, add it on before validating. If necessary for your situation, remove it again afterwards.

@visusys commented Nov 18, 2021

@visusys I'd recommend approaching it from the other direction – instead of adjusting the validator to allow invalid things, adjust your data. Check your own URLs and if they don't start with https://, add it on before validating. If necessary for your situation, remove it again afterwards.

I actually got it all figured out. All I needed to do was add a question mark at the end of the protocol identifier's non-capturing group:
(?:(?:(?:https?|ftp):)?\/\/)?

For anyone interested, I also ported the entire thing to PowerShell:
https://gist.github.com/visusys/1647c1a17ecfd4c305bfbf86b652084f
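In JavaScript terms, the tweak looks like this: only the first non-capturing group changes (a trailing ? is appended), and everything after it is unchanged from the gist's compressed one-liner.

```javascript
// Same as re_weburl, except the protocol / short-syntax group is now optional
const re_weburl_optional = /^(?:(?:(?:https?|ftp):)?\/\/)?(?:\S+(?::\S*)?@)?(?:(?!(?:10|127)(?:\.\d{1,3}){3})(?!(?:169\.254|192\.168)(?:\.\d{1,3}){2})(?!172\.(?:1[6-9]|2\d|3[0-1])(?:\.\d{1,3}){2})(?:[1-9]\d?|1\d\d|2[01]\d|22[0-3])(?:\.(?:1?\d{1,2}|2[0-4]\d|25[0-5])){2}(?:\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4]))|(?:(?:[a-z0-9\u00a1-\uffff][a-z0-9\u00a1-\uffff_-]{0,62})?[a-z0-9\u00a1-\uffff]\.)+(?:[a-z\u00a1-\uffff]{2,}\.?))(?::\d{2,5})?(?:[/?#]\S*)?$/i;

console.log(re_weburl_optional.test('www.google.com'));                           // true
console.log(re_weburl_optional.test('github.com/PowerShell/PowerShell'));         // true
console.log(re_weburl_optional.test('https://github.com/PowerShell/PowerShell')); // true
```

Be aware this also makes bare hostnames inside ordinary prose validate, which may or may not be what you want.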

@pukster commented Dec 23, 2021

How would one use this in a Postgres DB with POSIX-style matching? This is a little "above my pay grade", as I am struggling with it.

@cxytomo commented Dec 27, 2021

Try re_weburl.test('https://0.。。'); it returns true.
@dperini
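The false positive arises because U+3002 (。, the ideographic full stop) falls inside the [\u00a1-\uffff] class used for hostnames and TLDs. Since IDNA maps the ideographic dot variants to an ordinary ".", one mitigation sketch (the normalizeDots helper is hypothetical, not part of the gist) is to normalize them before validating:

```javascript
const re_weburl = /^(?:(?:(?:https?|ftp):)?\/\/)(?:\S+(?::\S*)?@)?(?:(?!(?:10|127)(?:\.\d{1,3}){3})(?!(?:169\.254|192\.168)(?:\.\d{1,3}){2})(?!172\.(?:1[6-9]|2\d|3[0-1])(?:\.\d{1,3}){2})(?:[1-9]\d?|1\d\d|2[01]\d|22[0-3])(?:\.(?:1?\d{1,2}|2[0-4]\d|25[0-5])){2}(?:\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4]))|(?:(?:[a-z0-9\u00a1-\uffff][a-z0-9\u00a1-\uffff_-]{0,62})?[a-z0-9\u00a1-\uffff]\.)+(?:[a-z\u00a1-\uffff]{2,}\.?))(?::\d{2,5})?(?:[/?#]\S*)?$/i;

// IDNA maps these dot look-alikes to U+002E during host processing:
// U+3002 ideographic full stop, U+FF0E fullwidth full stop,
// U+FF61 halfwidth ideographic full stop
function normalizeDots(url) {
  return url.replace(/[\u3002\uFF0E\uFF61]/g, '.');
}

console.log(re_weburl.test('https://0.。。'));                // true  (the reported false positive)
console.log(re_weburl.test(normalizeDots('https://0.。。'))); // false ('https://0...' is rejected)
```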

@suktec commented Dec 27, 2021

[\w\d.-@]+?(com|net|cn|org|asp|php)([/\w.?=]+)*/i

@Iinksafe commented Dec 29, 2021

I suggest removing the _ character from the host-name class, since it lets otherwise-invalid URLs pass:

/^(?:(?:(?:https?|ftp):)?\/\/)(?:\S+(?::\S*)?@)?(?:(?!(?:10|127)(?:\.\d{1,3}){3})(?!(?:169\.254|192\.168)(?:\.\d{1,3}){2})(?!172\.(?:1[6-9]|2\d|3[0-1])(?:\.\d{1,3}){2})(?:[1-9]\d?|1\d\d|2[01]\d|22[0-3])(?:\.(?:1?\d{1,2}|2[0-4]\d|25[0-5])){2}(?:\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4]))|(?:(?:[a-z0-9\u00a1-\uffff][a-z0-9\u00a1-\uffff-]{0,62})?[a-z0-9\u00a1-\uffff]\.)+(?:[a-z\u00a1-\uffff]{2,}\.?))(?::\d{2,5})?(?:[/?#]\S*)?$/i

@bruceeewong commented Feb 16, 2022

really helpful, thx

@konstantinschuette commented Apr 1, 2022

Are these valid URLs? Because they are marked as invalid:
http://www.google.com/"asdf"
http://google.com//
http://google.com/asd//

@jxn-30 commented Apr 1, 2022

http://www.google.com/"asdf"

No; according to the URL Standard, " is not a valid character in path segments (it is not a URL code point). Only the percent-encoded form %22 is valid.

However:

http://google.com//
http://google.com/asd//

As far as I understand URL-path-segment-string, they should be valid:

zero or more URL units excluding U+002F (/) and U+003F (?), that together are not a single-dot path segment or a double-dot path segment

=> zero URL units (an empty string) seems to be a valid URL-path-segment-string according to the URL Standard, and therefore these two examples should be valid URLs
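For what it's worth, the path component of the gist's regex is simply (?:[/?#]\S*)?, which accepts any run of non-whitespace after /, ? or #, so at least the compressed JavaScript version accepts all three examples (it performs no URL-code-point check on paths):

```javascript
const re_weburl = /^(?:(?:(?:https?|ftp):)?\/\/)(?:\S+(?::\S*)?@)?(?:(?!(?:10|127)(?:\.\d{1,3}){3})(?!(?:169\.254|192\.168)(?:\.\d{1,3}){2})(?!172\.(?:1[6-9]|2\d|3[0-1])(?:\.\d{1,3}){2})(?:[1-9]\d?|1\d\d|2[01]\d|22[0-3])(?:\.(?:1?\d{1,2}|2[0-4]\d|25[0-5])){2}(?:\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4]))|(?:(?:[a-z0-9\u00a1-\uffff][a-z0-9\u00a1-\uffff_-]{0,62})?[a-z0-9\u00a1-\uffff]\.)+(?:[a-z\u00a1-\uffff]{2,}\.?))(?::\d{2,5})?(?:[/?#]\S*)?$/i;

console.log(re_weburl.test('http://google.com//'));          // true
console.log(re_weburl.test('http://google.com/asd//'));      // true
console.log(re_weburl.test('http://www.google.com/"asdf"')); // true (" is non-whitespace)
```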

@shangdev commented Apr 6, 2022

I suggest removing the _ character from the host-name class, since it lets otherwise-invalid URLs pass:

/^(?:(?:(?:https?|ftp):)?\/\/)(?:\S+(?::\S*)?@)?(?:(?!(?:10|127)(?:\.\d{1,3}){3})(?!(?:169\.254|192\.168)(?:\.\d{1,3}){2})(?!172\.(?:1[6-9]|2\d|3[0-1])(?:\.\d{1,3}){2})(?:[1-9]\d?|1\d\d|2[01]\d|22[0-3])(?:\.(?:1?\d{1,2}|2[0-4]\d|25[0-5])){2}(?:\.(?:[1-9]\d?|1\d\d|2[0-4]\d|25[0-4]))|(?:(?:[a-z0-9\u00a1-\uffff][a-z0-9\u00a1-\uffff-]{0,62})?[a-z0-9\u00a1-\uffff]\.)+(?:[a-z\u00a1-\uffff]{2,}\.?))(?::\d{2,5})?(?:[/?#]\S*)?$/i

It does not support spaces, though.
