
Commit 46d075b

Nick Piggin authored and paulusmack committed
powerpc: Optimise smp_wmb
Change 2d1b202 ("powerpc: Fixup lwsync at runtime") removed __SUBARCH_HAS_LWSYNC, causing smp_wmb to revert back to eieio for all CPUs. This restores the behaviour introduced in 74f0609 ("powerpc: Optimise smp_wmb on 64-bit processors").

Signed-off-by: Nick Piggin <[email protected]>
Signed-off-by: Paul Mackerras <[email protected]>
1 parent a4e22f0 commit 46d075b

2 files changed: 6 additions & 2 deletions

arch/powerpc/include/asm/synch.h

Lines changed: 4 additions & 0 deletions
@@ -5,6 +5,10 @@
 #include <linux/stringify.h>
 #include <asm/feature-fixups.h>

+#if defined(__powerpc64__) || defined(CONFIG_PPC_E500MC)
+#define __SUBARCH_HAS_LWSYNC
+#endif
+
 #ifndef __ASSEMBLY__
 extern unsigned int __start___lwsync_fixup, __stop___lwsync_fixup;
 extern void do_lwsync_fixups(unsigned long value, void *fixup_start,

arch/powerpc/include/asm/system.h

Lines changed: 2 additions & 2 deletions
@@ -45,14 +45,14 @@
 #ifdef CONFIG_SMP

 #ifdef __SUBARCH_HAS_LWSYNC
-# define SMPWMB lwsync
+# define SMPWMB LWSYNC
 #else
 # define SMPWMB eieio
 #endif

 #define smp_mb() mb()
 #define smp_rmb() rmb()
-#define smp_wmb() __asm__ __volatile__ (__stringify(SMPWMB) : : :"memory")
+#define smp_wmb() __asm__ __volatile__ (stringify_in_c(SMPWMB) : : :"memory")
 #define smp_read_barrier_depends() read_barrier_depends()
 #else
 #define smp_mb() barrier()
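
For readers comparing the two barrier choices, below is a minimal, self-contained userspace sketch (not kernel code) of how the SMPWMB selection above resolves. The local __stringify helper stands in for the kernel's stringify_in_c, LWSYNC is mapped straight to the lwsync mnemonic, and __SUBARCH_HAS_LWSYNC is defined by hand in place of the new test in asm/synch.h; those substitutions are assumptions made purely for illustration.

/*
 * Illustrative userspace sketch only: mirrors the SMPWMB selection in
 * asm/system.h so it can be compiled and run anywhere.  The kernel's
 * real helpers live in asm/synch.h and asm/asm-compat.h.
 */
#include <stdio.h>

#define __stringify_1(x)	#x
#define __stringify(x)		__stringify_1(x)

/* Hand-defined here; in the kernel, asm/synch.h now defines this for
 * __powerpc64__ or CONFIG_PPC_E500MC builds. */
#define __SUBARCH_HAS_LWSYNC

/* Stand-in for the kernel's LWSYNC macro, which yields the lwsync instruction. */
#define LWSYNC lwsync

#ifdef __SUBARCH_HAS_LWSYNC
# define SMPWMB LWSYNC
#else
# define SMPWMB eieio
#endif

int main(void)
{
	/* In the kernel this string is emitted inside inline assembly. */
	printf("smp_wmb() would emit: %s\n", __stringify(SMPWMB));
	return 0;
}

Compiled and run on any host, this prints lwsync; removing the hand-written __SUBARCH_HAS_LWSYNC define makes it print eieio instead, which is the unintended fallback this commit fixes.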
