XVGA TFT update

i made some progress with the XVGA controller board.

Current State of the Board

First, it became more expensive, because i had to buy a minimum of 5 of the FPD-Link transmitter chips (note: i'll sell the others. Interested? ;-). Then the layout is very tight. See here:
2012-07-18 autorouted board
IC placement is nearly final; next i'll add some more hand-routed wires, auto-route again and then hand-optimize. This will take a week or two.
I had real problems with the RAMs and the FPD-Link transmitter, they only just fit between the '245 bus transceivers and the VGA connector. I tested a couple of arrangements, but this one produced the fewest vias.
FYI: bottom left is the 16 bit K1 bus, directly above it two 74245 bus transceivers, the SMD ICs are the RAMs and the FPD-Link transmitter. The next 6 chips to the right are drivers and drivers with latches, which are used to select between an externally supplied address (for the CPU reading/writing the video RAM) and the internal address from the counter cascade, used to address the video RAM for display. The next 'column' of chips holds the external address registers/counters, next is the internal address counter cascade, and the vertical chips at the right are the clock and the ATtiny. The 6 ICs at the bottom right are the control logic. There's an I2C EEPROM sitting on the rear side of the PCB underneath the bus connector.
Though i use a 15-pin VGA SUB-D connector, the signal is not VGA: it transmits 4 LVDS signal lanes, each 3 wires: positive and negative differential signals plus the associated GND. In addition, a PWM signal is transmitted on pin 15 which will control the LCD backlight brightness. According to what i found on the net this is a 5V 125kHz PWM signal.
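Assuming the backlight PWM comes from a timer in fast-PWM mode (the function name and parameters here are mine, just for illustration), the timer TOP value for a given target frequency is easy to derive:

```c
#include <stdint.h>

/* Fast-PWM output frequency is f_pwm = f_clk / (prescale * (TOP + 1)).
 * Solve for TOP, rounding to the nearest integer. */
static uint32_t pwm_top(uint32_t f_clk, uint32_t prescale, uint32_t f_pwm)
{
    uint32_t div = prescale * f_pwm;
    return (f_clk + div / 2) / div - 1;
}
```

With a 16 MHz clock and prescaler 1, 125 kHz needs TOP = 127, i.e. 128 counts per period, which leaves 7 bits of resolution for the brightness duty cycle.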

Current Circuit

Here's an update to the circuit as well:
Circuit 2012-07-18 - Data Paths

Circuit 2012-07-18 - Control Logics



Though i should finish the already-built LCDs first, i've already begun with the 3rd display. It's a 1024x768 pixel TFT from my old iBook. It has an LVDS FPD-Link connection, and i have searched the web for info about the panel and FPD-Link. I think i've got enough info to build it.

Major problems:

I need a special transmitter chip, preferably a 5V part. These chips are generally hard to find (never used by hobbyists) and 5V versions are even harder. But i'll get a quote today. :-)
Timing is at the upper end of any hobbyist project: the pixel clock is 65 MHz, which may eventually be lowered to ~62 MHz.
This requires at least 15ns RAM, which will result in very tight timing, or better 12ns. And i need 1.5 MByte of it. Though i have plenty of RAM in stock, i opted to buy three 256Kx16Bit 12ns RAMs. The IC head count on the PCB is already very high.
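The 1.5 MByte figure follows directly from the resolution at one 16-bit word per pixel, and the three RAMs match it exactly:

```c
#include <stdint.h>

/* 1024 x 768 pixels, 16 bits (2 bytes) per pixel */
static uint32_t framebuffer_bytes(void)
{
    return 1024UL * 768UL * 2UL;        /* = 1572864 bytes = 1.5 MByte */
}

/* three 256K x 16 bit RAMs */
static uint32_t ram_bytes(void)
{
    return 3UL * 256UL * 1024UL * 2UL;  /* = 1572864 bytes, a perfect fit */
}
```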

Data flow on the XVGA controller
Control signals

Project Page

The project page is .../IO-Boards/VGA/ on my home site. This is on my private computer and everything i do here is directly visible on this page. Currently it contains a collection of spec sheets and the current state of the controller board design.

The Plan

The design ideas are as follows:
• The VRAM is addressed by a 20 bit counter cascade. Due to timing problems, the address is buffered by a set of 74574 latches, so the address is always one clock cycle delayed. The RAM output data is directly fed into the FPD-Link transmitter, which is clocked by the same clock signal. All running at 64MHz with a clock cycle of ~15ns.
• The slow signals, VSYNC, HSYNC and DE (Display Enable) are generated by an ATtiny. It also controls count enable of the address counters, to stop them during HSYNC and VSYNC (or, when DE is false). The ATtiny will be clocked with 16 MHz synchronously with the 64MHz pixel clock.
The ATtiny will also generate the FFB (frame fly back) interrupt signal, which is very important:
• VRAM access from the CPU will be completely asynchronous to the pixel accesses for the display. It will simply override the signals for the display, resulting in 'snow'. Each access will 'destroy' the display of approx. 3 pixels. This allows me to access the VRAM without asserting the !WAIT signal on the bus. I have already checked the timing: writing is safe, reading is tight, but should work.
Accessing the VRAM requires sending an address and then one data i/o. The address is 20 bit, so it has to be transferred in two chunks. I opted to split the address into two 10 bit halves, which directly translate into the X and Y pixel address. To reduce the required bus transfers, i designed the address registers as counters as well. They provide an auto-increment feature, so that i only need to set the start address and can then read or write in burst mode, hopefully with the full bus bandwidth of 16 Mwords/sec. The X address can auto-increment, and the Y address can auto-decrement as well. I probably can't easily make the X address auto-decrement, because i simply don't have enough control lines. The 'control lines' are the bus's address lines, and there are 6 of them.
To avoid the 'snow' effect when accessing the VRAM, i plan to do most i/o during the vertical frame flyback, which may be up to 10% of the total frame time. The exact maximum number of lines during the ffb of my display will be determined once it is all built, so it's nice to have it programmable, because it's done by the ATtiny. It will be slightly tricky to align the control signals of the ATtiny with the 4-pixel boundary (the ATtiny clock is the pixel clock ÷ 4), because the DE (display enable) signal for the FPD transmitter and the count enable signal for the address counter must not start and stop somewhere in the middle of a 4-pixel package but exactly at the start or end. Otherwise the image on the TFT will be shifted by some pixels, missing some at the left side and displaying garbage at the right side.
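The addressing scheme can be sketched like this (names are mine, not the real register interface): the 20-bit VRAM address is the concatenation of the 10-bit Y and X counters, and a burst access post-increments X with 10-bit wrap-around:

```c
#include <stdint.h>

typedef struct { uint16_t x, y; } vram_addr_t;  /* two 10-bit counters */

/* 20-bit linear address: Y in the upper 10 bits, X in the lower 10 */
static uint32_t vram_linear(const vram_addr_t *a)
{
    return ((uint32_t)(a->y & 0x3FF) << 10) | (a->x & 0x3FF);
}

/* one burst access: yield the current address, then auto-increment X */
static uint32_t vram_access(vram_addr_t *a)
{
    uint32_t lin = vram_linear(a);
    a->x = (a->x + 1) & 0x3FF;       /* 10-bit counter wraps to 0 */
    return lin;
}
```

After setting the start address once, each further bus transfer is pure data, which is what makes the hoped-for 16 Mwords/sec burst rate plausible.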

Let's see how it all works!


The K1-16/16 CPU

The self-built K1-16/16 CPU, built with standard 74xx CMOS ICs

The K1-16/16 CPU is the heart of the self-designed and home-built K1-16/16 Computer.
It is built with CMOS ICs from the 74AC series and fits on 5 Euro boards (160 x 100 mm).

Sometimes you are struck by an idea...

Due to depression, programming became harder and harder. So i thought, why not do something simpler, with more manual work? Electronics, for instance. And, thanks to the internet, i had already read about other maniacs who built a 6502 CPU. Or a Z80 in an FPGA. Or Dennis Kuschel's myCPU. And there's a web ring about it. If others can do this, it can't be that hard. Basically...
Of course my CPU should be Different. Better. And Simple, so that i can understand it myself. B-)
For symmetry i settled with a 16/16 bit design: 16 data bits and 16 address bits.

Unusual and Generally Interesting Parameters

• Combined Harvard and Von Neumann architecture
• 16 MHz system clock
  Front panel with slow motion clock for exhibitions et al.
  Fully static design down to 0 Hz
• 16 bit internal data bus
• 16 bit internal address bus
• 64k x 16 bit internal ram
• 32k x 24 bit microcode
  organized as 2 code planes à 16k for conditional execution and branching.
  the microcode is copied from eproms to rams during boot for increased speed.
  it is also possible to load the microcode from an external source instead.
  the microcode implements:
    boot code, BIOS, kernel
    100++ assembler opcodes for ram-based programs
    100++ millicode opcodes for microcode-based forth or c-style programs
• No flag register. (but flags)
• Built with discrete logic using 74HCxx and 74ACxx ICs
  CPU fits on 5 "Euro" printed circuit boards (160 x 100mm)
• Manual circuit design
  Manual routing of the PCBs (with EagleCAD)
  Professionally made double-layer circuit boards

Harvard Architecture 

Programs can be written directly in microcode. Adopting this view, the K1 CPU has separated program and data memory. This is the Harvard Architecture.

Von Neumann Architecture

More typically, the CPU can also run a fixed microcode, which reads opcodes from the ram and executes them. Seen this way, it has a combined program and data memory. This is the Von Neumann Architecture.

Start on blogger.com

I've been building a CPU for 4 years now (more or less) and documented it on my home page k1.spdns.de. This worked quite well, but i wanted to separate the blog from the project documentation itself, and i wanted to enable some feedback. So i started this blog on blogger.com. I will move some stuff in here which was previously on my website; i'll see whether i can fix the dates.

     ... Kio !


2nd Display

Going into mass production. ;-) I built a second, very similar LCD display which uses the same controller board. This one has a backlight, but it is CCFL. I had no inverter, and building one and adapting it to the CCFL would have taken too long, so i replaced it with an array of LEDs. Not good, but working. See the photos on the LM64K101 - LCD Display 640x480 project page. Next is to debug the terminal software a little more and use it as output for the computer.


Debugging the LCD Display Driver and Hardware

This week i worked on the 640x480 pixel b&w LCD terminal. Soldering was easy, but fixing all the bugs took some time. I also had to do some changes to the terminal code because i realized that i was using the LCD upside down.

I connected the LCD and powered the board through the programming header. Of course nothing worked, except for the display refresh routine, which brought up a picture of the erased DRAM cells. Step by step i brought up more functions: erase screen, print characters, read and write whole pixel lines, and scrolling. I had to add some nops to the DRAM read and write timing, because the data goes through series resistors which create some delay. For the next board i'll reduce them slightly. Now printing of standard-size characters with 4 attributes in all combinations works.

Finally one last important step: attach it via serial line to USB to my Mac. Nothing worked. I adjusted the baud rate on both sides. I printed text from the LCD terminal on the serial line in an endless loop. There was no signal on the TxD line. Why? But there seemed to be a signal on the RxD handshake line. ... ???

I had connected data lines to handshake and handshake to data lines on the board. :-(. Fixed it with a cutter, solder and wire. Tested. Worked. :-)


Debugging i2c

And on it goes. The last days i spent finding out why the i2c interface did not work. The hardware is ok, it's a problem with the software. First i thought all i2c eeproms have a block size of 64 bytes. But it's not that easy. The block size varies with eeprom size. And manufacturer… :-| But most 8k eeproms have blocks of 32 bytes and most 16k and 32k eeproms have blocks of 64 bytes.
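A write must therefore be split so that no chunk crosses a block (page) boundary. A minimal sketch, with the page size passed in per device:

```c
#include <stddef.h>
#include <stdint.h>

/* Size of the next chunk of a write at 'addr' of 'len' bytes,
 * limited so the chunk never crosses a page boundary. */
static size_t eeprom_chunk(uint16_t addr, size_t len, size_t page_size)
{
    size_t room = page_size - (addr % page_size);  /* bytes left in page */
    return len < room ? len : room;
}
```

For example, writing 10 bytes starting at address 60 of a 64-byte-page device yields chunks of 4 and 6 bytes.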

Then busy polling after a block write did not work properly. I had started a read cycle to detect the busy state (the eeprom does not respond while busy), but this behaviour is only defined if you start a write cycle. So i rewrote my source to do this.
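This 'acknowledge polling' can be sketched like this; i2c_start_write() here is a simulated stand-in for the real bus routine that returns true once the eeprom ACKs its address:

```c
#include <stdbool.h>

/* Simulated eeprom: stays busy (NAKs) for a fixed number of polls. */
static int eeprom_busy_polls = 3;
static bool i2c_start_write(void)
{
    return eeprom_busy_polls-- <= 0;   /* NAK while busy, ACK when done */
}

/* Acknowledge polling: repeatedly start a write cycle until the
 * eeprom ACKs. Returns the number of polls needed, or -1 on timeout. */
static int eeprom_ack_poll(int max_tries)
{
    for (int i = 0; i < max_tries; i++)
        if (i2c_start_write())
            return i;
    return -1;
}
```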

Then somehow my start and stop sequences on the i2c bus sometimes failed. Now i test-read the data line to see if it is high and not pulled low by the target, for whatever reason.

Now access to the K1 bus eeproms works properly and hopefully reliably. I successfully downloaded the SIO driver code into the SIO driver eeprom and booted the CPU with driver initialization from eeproms, and not with data from the microcode. And it works! :-)

And, last but not least, selecting the i2c eeprom on a K1 bus card works as planned: all eeproms are addressed with address 0b000, but only the eeprom on the currently selected extension card actually has this address. The i/o cards disable their eeprom while they are not selected, and the easiest way to do this is to set any address pin of the eeprom to '1', so that the address no longer matches.
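For reference, the control byte of a 24Cxx-style i2c eeprom is 1010 A2 A1 A0 R/W, so pulling any address pin high makes a deselected card's eeprom stop matching the fixed 0b000 address (the helper name is mine):

```c
#include <stdint.h>

/* 24Cxx-style i2c control byte: 1 0 1 0 A2 A1 A0 R/W */
static uint8_t eeprom_ctrl_byte(uint8_t hw_addr, uint8_t read)
{
    return (uint8_t)(0xA0 | ((hw_addr & 0x07) << 1) | (read & 1));
}
```

The selected card answers to 0xA0/0xA1; a card whose A0 pin is pulled high would need 0xA2/0xA3 and therefore stays silent.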


Debugging the SIO board

Instead of writing some 'useful' commands i spent the last days tracking down a weird error. When i activated the timer interrupt of the 88C192 UART to give me a 100 Hz system timer interrupt, the whole system got stuck. The simulator worked, the real hardware stalled. Bad!

It took me some time with the minimalistic debugging facilities to conclude that the interrupt does not go away. When i rewrote the RETI (return-from-interrupt) opcode to execute the next opcode regardless of interrupt state, the system made it to the shell prompt, though eating all the cpu power. This only happened when i enabled the timer interrupt, else everything seemed to be ok. I double-checked my code three times. Do i have a broken UART? Is the documentation wrong?
Finally, when i programmed the UART to the longest possible duration, this left ~90% cpu time in the 'Halt' state (at 8MHz). Some observations later i discovered that interrupts came in bursts. I measured the burst rate: ~3.5Hz. I calculated the interrupt frequency: 7.372MHz/32/0xFFFF = 3.515Hz. This proved that the timer interrupt generated the bursts.
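The arithmetic checks out (7.3728 MHz crystal, prescaler 32, full 16-bit counter range, using 0xFFFF = 65535 as the divisor, as in the estimate above):

```c
/* longest timer period: crystal / prescaler / counter range */
static double timer_int_freq(void)
{
    return 7372800.0 / 32.0 / 65535.0;   /* ≈ 3.5157 Hz */
}
```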

But why did the interrupt stay active for ~1/35 sec (given the 10% cpu usage) and then go away? I examined my circuit design very carefully ... VERY carefully ... this whole thing looked like a ... and there i got it: the /INT output of the UART is open collector and i had no pull-up resistor fitted there! Now all observations made sense. I soldered a 5kΩ resistor between two suitable pins, restored the SIO eeprom driver, compiled the microcode rom, uploaded it to the front panel, reset the cpu from the front panel eeproms and yep, it worked! :-) The timer interrupt now eats approx. 1% (at 8MHz).


Debugging the CPU


I'm now moving back from emulation to the real hardware. I expected problems, and there they are.
Somehow the upload of microcode files to the front panel did not work. First it hung with XOFF, so i disabled XON/XOFF and lowered the baud rate. Then it transferred up to address 0x4A00 and aborted with an error. Each time i tried to do 'something' to find out the cause, the error changed. Finally i found out that at that position is the first unused gap in the microcode, and obviously refreshing the LCD is so time consuming that even the longest delay after one line of code (100ms) is not enough. Strangely, when i re-enabled XON/XOFF it worked. So microcode upload to the CPU is working again.
Fundamental Timing of the K1-16/16 CPU
Now the code does not make it beyond the initial register test. Loading the SR (shift-right) register fails to load the CY input into data bit D15. I remember that there was a problem with exactly this when i tested it more than a year ago. I had convinced myself it was a contact issue and tried to prove this again now. But it isn't. Actually the little chart with the fundamental timing of the CPU contains the secret:
The blue clk is the load signal for the registers – SR is a register – and the rising edge is where it latches new data from the data bus and, in the case of data bit D15, from the CY line. The CY line is an 'option control' line; it controls options in the various registers, if they have any. As can be seen in the image, the option control line toggles exactly at the moment (if it toggles at all) when the clk line rises. So we have a race condition here.
The only thing i can do is to delay the CY option control line for the shift registers. I examined whether this problem would show up at other places too, but it seems not. Wherever CY (or any other option control line) is used, there are some gate delays between the option control line and the data latching register. So it's enough to delay this signal for SR and SL only. Luckily there is an unused OR gate on the data registers board which i will use for this.


All test code performed without error. :-) Except for i2c test, because i have not yet attached any io device.


Weekend … fantastic weather … what to do? Ok, i manufactured the K1-system-bus. ;-)


Tata! The SIO board works. Today i got the console prompt on the terminal. There are still some issues to examine: Speaking to the i2c eeprom did not work. The hardware seems to work properly, but the eeprom does not respond. This is to be investigated. Whether interrupts work as expected is still to be tested. Though they seem to work.


Yep. Interrupts obviously work. SIO works: i/o to serial to my Mac works and i/o to the front panel works. Now i'll have to add some extras to the terminal in the frontpanel, e.g. handle some control codes. ;-) And then add some stuff to the boot shell of the computer, like cd, ls and so on. And investigate the i2c problem.


SIO, IDE and Terminal


SIO board
I had used the wrong SO-8 package in the drawings for the SIO and IDE board and had to place the I2C EEPROM very carefully on the PCB to solder it. The pins extended beyond the solder pads, but it seemed to work.

2012-03-17: Heart attack: i took a look at the 88C192 SIO's PDF and saw that the pins were numbered starting at a corner. X-( .... but this was for the TQFP-44 package only. Pin numbers of the PLCC package start in the middle of one side. HTF were these brain-dead idiots... ?

IDE board
Soldering the 50 CF card adapter pins went quite well: some flux applied in advance, soldering the pins with as little tin as possible, ignoring all junctions and blobs of tin, and then removing most of the solder with desoldering wick. Perfect. Stacking of the CF card and an IDE flash drive works as desired.
LCD driver board
I hope everything will work. Testing starts ... soon. :-]
2012-03-26: The PCB for the above-mentioned LCD display project arrived. Yet another board to test and write software for. But this one is easier.


SIO and IDE Cards

After a long time in which i played around with the C-style compiler for the microcode, which i expanded into a virtual instruction code compiler for use on my Mac, i resumed work on the half-sized IDE interface and the half-sized serial adapter. The IDE interface will connect two devices, a 96 MB DiskOnModule with a 40 pin IDE connector and a CF adapter for removable drives. The serial adapter will only be used temporarily for the K1-CPU itself. It will be replaced later by a 4 channel (or possibly more) serial card which will also incorporate a LAN adapter module, which i own. The serial ports can be used for asynchronous communication with SW or HW handshake or with clock signals of any polarity. XOR gates are useful programmable inverters. ;-) The 88C192 chips have idiotic restrictions on the selectable bit rates. :-|
The two boards have been ordered on 2012-02-04 from Leiton, like the others.
Related to the CPU project is another project i was working on the last two weeks: a controller board for an old black&white reflective LC display i own. It is based on an ATMega8 and uses old 41464 DRAM chips for its frame buffer.
2012-02-22: The IDE and SIO boards arrived. I got two of each, due to overproduction, for a fair additional cost. But the boards don't look good this time. As if processed in over-aged chemicals or something. I don't know.