<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<meta name="robots" content="noindex" />
<link rel="canonical" href="http://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/" />
<title>Print Page - Hacking NVidia Cards into their Professional Counterparts</title>
<style type="text/css">
body, a
{
color: #000;
background: #fff;
}
body, td, .normaltext
{
font-family: Verdana, arial, helvetica, serif;
font-size: small;
}
h1#title
{
font-size: large;
font-weight: bold;
}
h2#linktree
{
margin: 1em 0 2.5em 0;
font-size: small;
font-weight: bold;
}
dl#posts
{
width: 90%;
margin: 0;
padding: 0;
list-style: none;
}
dt.postheader
{
border: solid #000;
border-width: 1px 0;
padding: 4px 0;
}
dd.postbody
{
margin: 1em 0 2em 2em;
}
table
{
empty-cells: show;
}
blockquote, code
{
border: 1px solid #000;
margin: 3px;
padding: 1px;
display: block;
}
code
{
font: x-small monospace;
}
blockquote
{
font-size: x-small;
}
.smalltext, .quoteheader, .codeheader
{
font-size: x-small;
}
.largetext
{
font-size: large;
}
.centertext
{
text-align: center;
}
hr
{
height: 1px;
border: 0;
color: black;
background-color: black;
}
</style>
<script type="text/javascript"><!-- // --><![CDATA[
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-13007638-2']);
_gaq.push(['_trackPageview']);
(function() {
var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
})();
// ]]></script>
</head>
<body>
<h1 id="title">EEVblog Electronics Community Forum</h1>
<h2 id="linktree">Electronics => Projects, Designs, and Technical Stuff => Topic started by: gnif on March 15, 2013, 03:05:16 PM</h2>
<dl id="posts">
<dt class="postheader">
Title: <strong>Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>gnif</strong> on <strong>March 15, 2013, 03:05:16 PM</strong>
</dt>
<dd class="postbody">
Hi All,<br /><br />I did originally post this on the nVidia forums, but they have silently deleted it&nbsp; :--, obviously they do not like what I have found becoming public&nbsp; &gt;:D.<br /><br />First, a bit of history for those that are unaware. NVidia has for a long time had two ranges of cards: the GeForce for the gaming market, the Quadro for the professional market, and more recently the Tesla range for high-end parallel computing. As I am sure most of you are aware, it is cheaper to manufacture a single chip and cripple it in some way for different product lines than it is to make different silicon for every product.<br /><br />In the past it has been possible to convert GeForce cards into Quadros if you could find what they call &#039;hardware straps&#039; on the board and change them. These straps control the PCI device ID that the card reports to the computer, and as such, what the drivers will allow the card to do. nVidia then changed the way this all works, and the conversion was not possible for quite a few generations of cards, until someone on the nVidia forums discovered that a GeForce 4xx card could be turned into its higher-end counterpart by changing the hardware strap values via an undocumented override in the EEPROM. They were quick to disable this by changing the drivers to look only at the hardware straps for the PCI ID.<br /><br />I own an NVidia GTX 690, which I bought for two reasons: gaming, and a multi-monitor setup for work. NVidia made it very clear that this card would drive up to 3 screens in 2D, which it does quite nicely&nbsp; :-+... under Windows&nbsp; :--! The tight asses have decided that if you want this feature under Linux you have to get a Quadro, which has Mosaic support&nbsp; :palm:. So naturally I decided to look at how to mod the card, as the price difference is over $1000 between the GTX 690 and the Quadro K5000 (same GPU) and, get this... the K5000 is only single GPU and clocked some 25-30% slower than the gaming card. What a joke :-DD.<br /><br />What NVidia has done is change the way it handles the straps: instead of just pulling the straps high or low to control the switches as they did previously, they are now read as analogue values. The scheme is as follows.<br /><br />When pulling high:<br /><br />5K&nbsp; &nbsp;= 8<br />10K = 9<br />15K = A<br />20K = B<br />25K = C<br />30K = D<br />35K = E<br />40K = F<br /><br />When pulling low I expect the same scheme, but for values 7 down to 0; I did not test this, as each nibble of the device ID I was targeting is &gt;= 8.<br /><br />There are two tiny SMD resistors on the board, one for each nibble of the PCI device ID&#039;s low byte. Originally the GTX 690 has a device ID of 0x1188, so to become a Quadro K5000 this has to be changed to 0x11BA, which equates to 20K and 15K resistors. If you wanted to change it to a Tesla K10, you would change it to 0x118F, which equates to 5K and 40K resistors.<br /><br />This will only change the rear GPU on the GTX 690; I have yet to identify the resistors to change for the front one. I would also wager a bet that the new NVidia Titan can be upgraded into the Tesla K20 using the same method.<br /><br />Anyway, enough with the description, here are the photos of what to change:<br />(https://files.spacevs.com/gtx%20690%20back.jpg)<br />(https://files.spacevs.com/gtx%20690%20mod.jpg)<br /><br />And the results:<br />(https://files.spacevs.com/quatro.png)<br /><br />(https://files.spacevs.com/k10.png)<br />
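<br />For anyone who wants to work out the resistor pair for a different target ID, here is a minimal sketch (Python; the helper name is my own, not from any NVidia tool) of the mapping described above, assuming both nibbles of the target byte are &gt;= 8 so only the pull-up table applies:<br /><code># Pull-up resistor (kOhm) -&gt; strap nibble, per the table above.<br />PULL_UP = {5: 0x8, 10: 0x9, 15: 0xA, 20: 0xB, 25: 0xC, 30: 0xD, 35: 0xE, 40: 0xF}<br />NIBBLE_TO_R = {v: k for k, v in PULL_UP.items()}<br /><br />def straps_for(device_id):<br />&nbsp;&nbsp;&nbsp;&nbsp;# Split the low byte of the PCI device ID into its two nibbles,<br />&nbsp;&nbsp;&nbsp;&nbsp;# then look up the pull-up resistor that encodes each nibble.<br />&nbsp;&nbsp;&nbsp;&nbsp;hi = (device_id &gt;&gt; 4) &amp; 0xF<br />&nbsp;&nbsp;&nbsp;&nbsp;lo = device_id &amp; 0xF<br />&nbsp;&nbsp;&nbsp;&nbsp;return NIBBLE_TO_R[hi], NIBBLE_TO_R[lo]<br /><br />print(straps_for(0x11BA))&nbsp; # Quadro K5000 -&gt; (20, 15)<br />print(straps_for(0x118F))&nbsp; # Tesla K10&nbsp; &nbsp; -&gt; (5, 40)</code><br />Running it reproduces the two conversions above: 0x11BA gives (20, 15) and 0x118F gives (5, 40).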
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>Strada916</strong> on <strong>March 15, 2013, 03:55:17 PM</strong>
</dt>
<dd class="postbody">
Great work there gnif. Have they sorted out the &quot;Your video card has stopped responding.&quot;&nbsp; |O errors? I thought for ages it was my system, until 6 months ago when I started to work for a new company that has all Lenovo computers. These machines gave the same error message. This was all under Windows 7. Sorry, I digress; great work nonetheless.&nbsp; :-+
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>gnif</strong> on <strong>March 15, 2013, 03:58:04 PM</strong>
</dt>
<dd class="postbody">
<div class="quoteheader"><div class="topslice_quote"><a href="http://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg202917/#msg202917">Quote from: Strada916 on March 15, 2013, 03:55:17 PM</a></div></div><blockquote class="bbc_standard_quote">Great work there gnif. Have they sorted out the &quot;Your video card has stopped responding.&quot;&nbsp; |O errors. I thought for ages it was my system. Until 6 months ago when I started to work for a new company who have all Lenovo computers. These machines too gave this error message. This was all under windows7. Sorry I diress, great work there non the less.&nbsp; :-+<br /></blockquote><div class="quotefooter"><div class="botslice_quote"></div></div><br />Thanks :) Aparrently it is fixed in the later drivers, but I doubt it, I had to go to a 6 month old driver to get away from all sorts of issues with SLI and surround vision under windows.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>amyk</strong> on <strong>March 15, 2013, 10:52:53 PM</strong>
</dt>
<dd class="postbody">
Do you mean the &quot;front&quot; one? Are the strapping resistors really closer to the other GPU, or is your computer back to front (http://www.electricstuff.co.uk/backwardspc.html)?<br /><br />Carefully comparing the PCBs (or suitable images thereof) of the 690 with those of the K5000 might yield useful results.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>gnif</strong> on <strong>March 15, 2013, 10:59:33 PM</strong>
</dt>
<dd class="postbody">
<div class="quoteheader"><div class="topslice_quote"><a href="http://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg203026/#msg203026">Quote from: amyk on March 15, 2013, 10:52:53 PM</a></div></div><blockquote class="bbc_standard_quote">Do you mean the &quot;front&quot; one, are the strapping resistors really closer to the other GPU, or is your computer back to front ([url]http://www.electricstuff.co.uk/backwardspc.html[/url])?<br /><br />Carefully comparing the PCBs (or suitable images thereof) of the 690 with the K5000 might yield useful results.<br /></blockquote><div class="quotefooter"><div class="botslice_quote"></div></div><br /><br />Sorry, to be clear, when looking at a card, the rear connectors I have always seen as the front of the card. To be specific, it is GPU2 that can be changed. Comparing to a K5000 will not help as it does not have two GPUs on it. Comparing to a Tesla K10 may help as the use the same reference design.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>airthimble</strong> on <strong>March 16, 2013, 01:37:53 AM</strong>
</dt>
<dd class="postbody">
gnif,<br /><br />I saw your post on the CUDA developer forums, which got deleted; I then tracked your username to overclock.net and finally made my way here. I am interested in exploring this with some of the single-GPU cards. I am curious as to how you found this mod; were you able to get the schematics for the nVidia reference design?
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>Ed.Kloonk</strong> on <strong>March 16, 2013, 03:24:05 AM</strong>
</dt>
<dd class="postbody">
Ole Linus has had a bit to say in regards to Nvidia, and he&#039;s dead right. A while back Nvidia was the shit for Linux with its tolerable binary blob drivers, but once they got a rep for being Linux-friendly, they seemed to do everything possible to back away from that position. Maybe M$ scared them?<br /><br />Further, I have been watching the history of the Amiga computer and the demise of Commodore: how great hardware combined with a great user community can still be destroyed by a few businessmen fools.<br /><br />IMO, nvidia is following in Commodore&#039;s footsteps and will probably end up just as bankrupt.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>olsenn</strong> on <strong>March 16, 2013, 03:29:00 AM</strong>
</dt>
<dd class="postbody">
You&#039;ve got balls for taking a soldering iron to a GTX-690!
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>Kaluriel</strong> on <strong>March 16, 2013, 03:33:23 AM</strong>
</dt>
<dd class="postbody">
I didn&#039;t even realise nVidia was doing this. Years ago it used to be just broken parts that were disabled, and now they&#039;re purposely disabling working parts. For shame.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>gnif</strong> on <strong>March 16, 2013, 08:42:46 AM</strong>
</dt>
<dd class="postbody">
<div class="quoteheader"><div class="topslice_quote"><a href="http://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg203086/#msg203086">Quote from: airthimble on March 16, 2013, 01:37:53 AM</a></div></div><blockquote class="bbc_standard_quote">gnif,<br /><br />I saw your post on the cuda developer forums, which got deleted, then I tracked your username to overclock.net and finally made my way here. I am interested in exploring this with some of the single gpu cards. I am curious as to how you found this mod, were you able to get the schematics for the nvidia reference design?<br /></blockquote><div class="quotefooter"><div class="botslice_quote"></div></div><br />No, no schematic, what I did was look for resistors that looked like they had an alternative position, have a look at the photos and you will see what I mean. Any that I suspected of being a strap I used a meter to check if the resistor was connected to ground of 3.3V directly, and looked where the general traces were going in the area. If they went towards the GPU and connected to one of the rails it was a pretty good bet that it was a hard strap.<br /><br /><div class="quoteheader"><div class="topslice_quote"><a href="http://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg203102/#msg203102">Quote from: Ed.Kloonk on March 16, 2013, 03:24:05 AM</a></div></div><blockquote class="bbc_standard_quote">Ole Linus has had a bit to say in regards to Nvidia. And he&#039;s dead right. A while back Nvidia was the shit for Linux with it&#039;s tolerable binary blob drivers. But once they got a rep for being Linux friendly, they seemed to do everything possible to back away from that position. Maybe M$ scared them?<br /></blockquote><div class="quotefooter"><div class="botslice_quote"></div></div><br />I watched that a few days ago, couldn&#039;t agree more with him.<br /><br /><div class="quoteheader"><div class="topslice_quote"><a href="http://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg203104/#msg203104">Quote from: olsenn on March 16, 2013, 03:29:00 AM</a></div></div><blockquote class="bbc_standard_quote">You&#039;ve got balls for taking a soldering iron to a GTX-690!<br /></blockquote><div class="quotefooter"><div class="botslice_quote"></div></div><br />No, just anger at NVidia for not clearly advertising that surround does not work under linux.<br /><br /><div class="quoteheader"><div class="topslice_quote"><a href="http://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg203106/#msg203106">Quote from: Kaluriel on March 16, 2013, 03:33:23 AM</a></div></div><blockquote class="bbc_standard_quote">I didn&#039;t even realise nVidia was doing this. Years ago it use to just be broken parts that were disabled, and now they&#039;re just purposely disabling parts. For shame<br /></blockquote><div class="quotefooter"><div class="botslice_quote"></div></div><br />Agreed, NVidia are the Microsoft of the hardware industry.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>gnif</strong> on <strong>March 16, 2013, 09:06:16 AM</strong>
</dt>
<dd class="postbody">
I would wager a bet that these are the straps on the EVGA 670 and 680 (they use the same PCB); I would say the top set is more likely to contain the PCI device ID straps. The lower set I would only investigate if I couldn&#039;t find it in the top set, as they look less like straps.<br /><br />A quick and simple way I found to test is to take a 2.2K resistor, tack a wire onto the board without removing the existing resistor, and pull it to ground or 3.3V. Make a DOS-bootable USB device and throw nvflash onto it; running &#039;nvflash -a&#039; will show you what the current ID is without having to boot all the way into Windows/Linux.<br />
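<br />One thing to keep in mind with that test: if the tacked-on resistor pulls to the same rail as the strap resistor already fitted, the strap circuit sees the parallel combination of the two, not the new value alone. A quick sanity check (Python, plain parallel-resistor arithmetic, nothing NVidia-specific; the 20K is my example value):<br /><code>def parallel(r1, r2):<br />&nbsp;&nbsp;&nbsp;&nbsp;# Effective resistance of two resistors in parallel.<br />&nbsp;&nbsp;&nbsp;&nbsp;return r1 * r2 / (r1 + r2)<br /><br /># 2.2K tacked across an existing 20K strap resistor:<br />print(parallel(2.2, 20))&nbsp; # ~1.98 kOhm, well below the 5K bottom step</code><br />So a 2.2K test resistor will dominate whatever is already fitted, which is presumably the point of choosing it.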
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>gnif</strong> on <strong>March 16, 2013, 09:10:33 AM</strong>
</dt>
<dd class="postbody">
And these might be them on the EVGA GTX 660
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>toster</strong> on <strong>March 17, 2013, 10:34:39 PM</strong>
</dt>
<dd class="postbody">
Hum... what happens if I get the device ID wrong, i.e. if the driver doesn&#039;t work with the chip that&#039;s on the card? Does it simply fall back to &quot;software rendering&quot;, or can something worse happen?<br /><br />I have a 9600GT I could mess around with; however, the chip is a G94b. It looks like the counterpart is the Quadro FX 1800, which has a G94. The missing &#039;b&#039; bothers me.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>Zibri</strong> on <strong>March 19, 2013, 04:41:31 AM</strong>
</dt>
<dd class="postbody">
hmm very interesting...<br />I wonder if a similar approach can enable disabled cores on x70 cards to make them x80 (660/670 &gt;&gt; 680&nbsp; &nbsp;or 560/570 &gt;&gt; 580).<br />
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>gnif</strong> on <strong>March 19, 2013, 05:18:37 AM</strong>
</dt>
<dd class="postbody">
<div class="quoteheader"><div class="topslice_quote"><a href="http://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg203656/#msg203656">Quote from: toster on March 17, 2013, 10:34:39 PM</a></div></div><blockquote class="bbc_standard_quote">Hum.. what happens if I get the device id wrong? If the driver doesn&#039;t work with the chip that&#039;s on the card? Does it simply fall back to &quot;software rendering&quot; or can something worse happen?<br /><br />I have a 9600GT I could mess around with, however the chip is G94b. It looks like the counterpart is Quadro FX 1800 which has a G94. The missing &#039;b&#039; bothers me.<br /></blockquote><div class="quotefooter"><div class="botslice_quote"></div></div><br />Windows will just start up in VGA mode as it does not know the card, there is no risk to the card other then botching your soldering.<br /><br /><div class="quoteheader"><div class="topslice_quote"><a href="http://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg204327/#msg204327">Quote from: Zibri on <strong>Today</strong> at 04:41:31 AM</a></div></div><blockquote class="bbc_standard_quote">hmm very interesting...<br />I wonder if a similar approach can enable disabled cores on x70 cards to make them x80 (660/670 &gt;&gt; 680&nbsp; &nbsp;or 560/570 &gt;&gt; 580).<br /></blockquote><div class="quotefooter"><div class="botslice_quote"></div></div><br />Unknown, you would have to test it, I do know however that you can change the device ID to make a card become a 670, but do not know if it disables/enabled cores.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>SeanB</strong> on <strong>March 19, 2013, 06:25:16 AM</strong>
</dt>
<dd class="postbody">
Normally the parts that fail full QC are used to make the lower-spec units, but as process yield improves the rejects become fewer, so they start using full-spec devices instead and just disable them. Normally this is done with a chip-level probe test and zener-zapping an internal fuse. In the old days you could change certain Intel and AMD parts back, as they had either a pin strapping or a jumper on top of the case that was cut to set options.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>HAL-42b</strong> on <strong>March 19, 2013, 08:22:16 AM</strong>
</dt>
<dd class="postbody">
Here is where all the traffic is coming from, in case you were wondering.<br /><br />http://www.reddit.com/r/linux_gaming/comments/1aj3n3/not_cool_nvidia/ (http://www.reddit.com/r/linux_gaming/comments/1aj3n3/not_cool_nvidia/)<br /><br />And thanks, OP, for the brilliant writeup.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>cloudscapes</strong> on <strong>March 19, 2013, 10:42:03 AM</strong>
</dt>
<dd class="postbody">
It&#039;s on hackaday as well, gratz!<br /><br />http://hackaday.com/2013/03/18/hack-removes-firmware-crippling-from-nvidia-graphics-card/ (http://hackaday.com/2013/03/18/hack-removes-firmware-crippling-from-nvidia-graphics-card/)<br /><br />This has been going on for a while. I remember a decade ago you could turn some of the better GeForce 2s into a Quadro 2. I also heard years ago that Quadros, though better at workstation stuff like CAD and 3D packages, might offer inferior performance in gaming. Like they trade off speed for greater precision when going GeForce -&gt; Quadro, or something like that. A higher-definition z-buffer, or something. I&#039;m unsure if that&#039;s still the case, if it ever was.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>Zibri</strong> on <strong>March 19, 2013, 01:33:37 PM</strong>
</dt>
<dd class="postbody">
hmm sure, but I bet such &quot;fuses&quot; are outside the chip.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>Omnipotent_Galaxy</strong> on <strong>March 19, 2013, 03:12:41 PM</strong>
</dt>
<dd class="postbody">
hi gnif,<br /><br />I am wondering if the same method will be useful on mobility GeForce cards, since I own a Clevo GTX 675MX card with a GK104 and I&#039;m looking to turn it into a K4000M card with the same CUDA cores.<br /><br />Though some people have reported successful soft-mods from the 675MX to the K4000M via VBIOS or other means, I&#039;m still searching for a way to make a long-term stable mod.<br /><br />Anyway, the price of a K4000M with warranty is maybe its worst part, and many users are using ES cards or other cards from factories without warranty... What a pity.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>EEVblog</strong> on <strong>March 19, 2013, 03:15:53 PM</strong>
</dt>
<dd class="postbody">
Two Hack-A-Days from GNIF in almost as many days!&nbsp; :-+<br />What will he hack next?&nbsp; :-//<br /><br />Dave.
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>Omnipotent_Galaxy</strong> on <strong>March 19, 2013, 03:21:32 PM</strong>
</dt>
<dd class="postbody">
<div class="quoteheader"><div class="topslice_quote"><a href="http://www.eevblog.com/forum/projects/hacking-nvidia-cards-into-their-professional-counterparts/msg204327/#msg204327">Quote from: Zibri on <strong>Today</strong> at 04:41:31 AM</a></div></div><blockquote class="bbc_standard_quote">hmm very interesting...<br />I wonder if a similar approach can enable disabled cores on x70 cards to make them x80 (660/670 &gt;&gt; 680&nbsp; &nbsp;or 560/570 &gt;&gt; 580).<br /></blockquote><div class="quotefooter"><div class="botslice_quote"></div></div><br />Some GTXx70/x60 cards do use the original CUDA core with the GTXx80 cards but there is likely to be some problems with the &#039;hidden&#039; parts of the core anyway. <br /><br />This generation of NV cards uses the GK104 core from 576sp cards (mobility ones of K3000m) to 1344sp cards. It is hard to imagine the power supply of GTX660Ti can be enough for GTX680... <br /><br />Though with great expectation, Kepler is not like Fermi... GTX560Ti 448sp was just a dream maybe?
</dd>
<dt class="postheader">
Title: <strong>Re: Hacking NVidia Cards into their Professional Counterparts</strong><br />
Post by: <strong>jerry507</strong> on <strong>March 19, 2013, 03:42:26 PM</strong>
</dt>
<dd class="postbody">
It wasn&#039;t quite clear to me: did you just turn your 690 into a K5000? I understand that you wanted the multiple-monitor support for Linux, but you said you also bought it for gaming. The K5000 is significantly worse in that respect; is there something about this hack that preserves gaming performance?<br /><br />I suppose it&#039;s a question of priorities.
</dd>
</dl>
<div id="footer" class="smalltext">
<span class="smalltext" style="display: inline; visibility: visible; font-family: Verdana, Arial, sans-serif;"><a href="http://www.eevblog.com/forum/index.php?action=credits" title="Simple Machines Forum" target="_blank" class="new_win">SMF 2.0.4</a> |
<a href="http://www.simplemachines.org/about/smf/license.php" title="License" target="_blank" class="new_win">SMF &copy; 2013</a>, <a href="http://www.simplemachines.org" title="Simple Machines" target="_blank" class="new_win">Simple Machines</a><br /><span class="smalltext"><a href="http://www.smfads.com" target="_blank">SMFAds</a> for <a href="http://www.createaforum.com" title="Forum Hosting">Free Forums</a></span>
</span>
</div>
</body>
</html>