Why am I getting a precise bus fault exception (PRECISERR) on what looks like a perfectly valid aligned access? (Cortex-M7)
I'm getting a HardFault resulting from a forced/escalated precise bus fault exception, as indicated by the PRECISERR bit in the BFSR register, and I can't figure out why it is occurring. The exception occurs inside vendor-supplied startup code that previously executed fine, and I can't see any alignment or memory-related issue.
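For reference, I read the fault status from the HardFault handler with something along these lines (a minimal sketch, not the vendor handler; the SCB addresses are architectural for ARMv7-M: CFSR at 0xE000ED28, whose second byte is the BFSR, and BFAR at 0xE000ED38):
HardFault_Handler:
ldr r0, =0xE000ED28 /* SCB->CFSR: bit 9 = PRECISERR, bit 15 = BFARVALID */
ldr r0, [r0]
ldr r1, =0xE000ED38 /* SCB->BFAR: faulting address, meaningful only if BFARVALID is set */
ldr r1, [r1]
b . /* park here and inspect r0/r1 in the debugger */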
The offending instruction is ldrlt r0, [r1], #4, which faults on the first iteration through the loop with the value 0x00040458 in r1. The full instruction sequence is shown below; the other relevant symbols, loaded into r2 and r3, are defined in the comments.
/* Loop to copy data from read-only memory to RAM. The ranges
 * to copy from/to are specified by the following symbols defined in
 * the linker script.
 * __etext: End of the code section, i.e. start of the data to copy from.
 * __data_start__/__data_end__: RAM address range that the data should be
 *   copied to. Both must be aligned to a 4-byte boundary.
 * __noncachedata_start__/__noncachedata_end__: non-cacheable region. */
ldr r1, =__etext /* equal to 0x00040458 */
ldr r2, =__data_start__ /* equal to 0x20000000 */
ldr r3, =__data_end__ /* equal to 0x20000224 */
.LC0:
cmp r2, r3
ittt lt
ldrlt r0, [r1], #4 /* <---- exception triggered here */
strlt r0, [r2], #4
blt .LC0
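For readers not used to Thumb-2 IT blocks, the loop above is equivalent to this unconditional form (my own sketch, not the vendor code); note that both memory operands use post-indexed addressing:
.LC0:
cmp r2, r3
bge .LC0_done /* exit once r2 >= r3 */
ldr r0, [r1], #4 /* load the word at [r1], then r1 += 4 */
str r0, [r2], #4 /* store the word to [r2], then r2 += 4 */
b .LC0
.LC0_done: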
The offending address reported in BFAR is 0x00040458, which matches the value in r1 and is a perfectly valid 32-bit-aligned address within the ITCM region (0x00000000 to 0x0007FFFF).
I'm not sure what else could be causing this exception if the memory access itself looks fine. The fault first appeared after I expanded the m_text region in my linker file, as shown below:
MEMORY
{
m_interrupts (RX) : ORIGIN = 0x00000000, LENGTH = 0x00000400
m_text (RX) : ORIGIN = 0x00000400, LENGTH = 0x00074000 /* changed from LENGTH = 0x0003FC00 */
m_data (RW) : ORIGIN = 0x20000000, LENGTH = 0x00020000
m_data2 (RW) : ORIGIN = 0x20200000, LENGTH = 0x00020000
}
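For what it's worth, a guard along these lines would at least catch a plain region overflow (a sketch using ld's ASSERT command, not something from the vendor script; it assumes the ITCM really does span the full 0x00000000 to 0x0007FFFF window at run time):
/* Fails the link if m_text spills past the documented 512 KB ITCM window. */
ASSERT(ORIGIN(m_text) + LENGTH(m_text) <= 0x00080000, "m_text extends beyond the 512 KB ITCM window")
Arithmetically it passes here: 0x400 + 0x74000 = 0x74400, which is within the documented window.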
If it isn't an alignment issue, I'm not sure what it could be. 0x00040458 is most definitely word-aligned, as is 0x0004045C, the result of applying the #4 offset in the ldr instruction.
Also, why is 0x0004045C not the address shown in BFAR? My reading of the Cortex-M7 documentation was that the ldr instruction applies the offset to the base register before the memory access occurs.
The full exception register contents are shown below for completeness.